[ 531.623994] env[67905]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 532.240534] env[67964]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 533.575419] env[67964]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=67964) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 533.575736] env[67964]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=67964) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 533.575876] env[67964]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=67964) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 533.576207] env[67964]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 533.780439] env[67964]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=67964) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 533.790578] env[67964]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=67964) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 533.895986] env[67964]: INFO nova.virt.driver [None req-eb8ffb3c-c593-46b4-883d-7b6d70ca814c None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 533.970698] env[67964]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 533.970865] env[67964]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 533.970965] env[67964]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=67964) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 536.763708] env[67964]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-a085959a-2845-4141-a035-444c6fea9839 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 536.783484] env[67964]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=67964) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 536.783684] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-771ae91b-7a09-432b-a853-2f7e59d04137 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 536.823759] env[67964]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 697ba.
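Note on the two opening warnings: eventlet has to monkey-patch the standard library before modules such as urllib3 are imported, otherwise those modules keep references to the unpatched socket/ssl objects. Nova only warns here because importing (rather than executing) nova code with an unpatched urllib3 is normally harmless. A minimal sketch of the required ordering, using only eventlet's public API:

    # Patch first, import afterwards; reversing these two steps
    # produces exactly the warning logged above.
    import eventlet
    eventlet.monkey_patch()  # patches socket, ssl, threading, time, ...

    import urllib3  # now sees the green (patched) standard library
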
[ 536.823977] env[67964]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.853s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 536.824441] env[67964]: INFO nova.virt.vmwareapi.driver [None req-eb8ffb3c-c593-46b4-883d-7b6d70ca814c None None] VMware vCenter version: 7.0.3
[ 536.827820] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-063c688f-a04a-4aed-84c0-ea3ff8c6eff3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 536.848877] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0286d82-9d83-487e-97b6-082de76251f9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 536.854720] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-494498d6-e0a3-43e1-9781-a3ab8da3a10b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 536.861355] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d23e7e1d-5575-4b4d-9b41-4bdf4b86d281 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 536.874289] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4d8b7f6-67b8-4905-8b6d-73ad46958c2e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 536.880344] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44c9a32f-dbee-49d4-b76b-09ab704f0656 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 536.910485] env[67964]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-c97a7097-0863-4d7a-8dd0-799c4c0be053 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 536.915377] env[67964]: DEBUG nova.virt.vmwareapi.driver [None req-eb8ffb3c-c593-46b4-883d-7b6d70ca814c None None] Extension org.openstack.compute already exists. {{(pid=67964) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 536.917934] env[67964]: INFO nova.compute.provider_config [None req-eb8ffb3c-c593-46b4-883d-7b6d70ca814c None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
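Note on the Acquiring/acquired/"released" triple around VMwareAPISession._create_session (held 2.853s, the time spent logging into vCenter): these lines are emitted by oslo.concurrency whenever code serializes on a named lock. A minimal sketch of the two public entry points that produce them; the body here is hypothetical, the real caller is the oslo.vmware session code:

    from oslo_concurrency import lockutils

    # Decorator form: all callers of create_session() share one named lock,
    # and lockutils logs how long each caller waited and held it.
    @lockutils.synchronized('oslo_vmware_api_lock')
    def create_session():
        pass  # hypothetical placeholder for the vCenter login

    # Context-manager form of the same lock:
    with lockutils.lock('oslo_vmware_api_lock'):
        pass
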
[ 536.935747] env[67964]: DEBUG nova.context [None req-eb8ffb3c-c593-46b4-883d-7b6d70ca814c None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),fcf100b6-6d5c-4595-b5a6-17192ebda685(cell1) {{(pid=67964) load_cells /opt/stack/nova/nova/context.py:464}}
[ 536.937594] env[67964]: DEBUG oslo_concurrency.lockutils [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 536.937809] env[67964]: DEBUG oslo_concurrency.lockutils [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 536.938501] env[67964]: DEBUG oslo_concurrency.lockutils [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 536.938908] env[67964]: DEBUG oslo_concurrency.lockutils [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Acquiring lock "fcf100b6-6d5c-4595-b5a6-17192ebda685" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 536.939109] env[67964]: DEBUG oslo_concurrency.lockutils [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Lock "fcf100b6-6d5c-4595-b5a6-17192ebda685" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 536.940112] env[67964]: DEBUG oslo_concurrency.lockutils [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Lock "fcf100b6-6d5c-4595-b5a6-17192ebda685" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 536.964662] env[67964]: INFO dbcounter [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Registered counter for database nova_cell0
[ 536.973093] env[67964]: INFO dbcounter [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Registered counter for database nova_cell1
[ 536.976108] env[67964]: DEBUG oslo_db.sqlalchemy.engines [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67964) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 536.976501] env[67964]: DEBUG oslo_db.sqlalchemy.engines [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67964) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
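Note on the per-cell lock pairs: each cell's database connection is built once and cached, and the cache is guarded by a lock named after the cell UUID, which is what the paired Acquiring/acquired/"released" lines around get_or_set_cached_cell_and_set_connections show. A minimal sketch of that get-or-set-under-lock pattern, assuming a plain dict cache and a hypothetical create_connection callable (not nova's actual implementation):

    import threading

    CELL_CACHE = {}    # cell uuid -> connection (hypothetical cache)
    CELL_LOCKS = {}    # cell uuid -> per-cell lock
    _LOCKS_GUARD = threading.Lock()

    def get_or_set_cached_cell(cell_uuid, create_connection):
        # One named lock per cell, so concurrent requests for the same
        # cell build its connection exactly once.
        with _LOCKS_GUARD:
            lock = CELL_LOCKS.setdefault(cell_uuid, threading.Lock())
        with lock:
            if cell_uuid not in CELL_CACHE:
                CELL_CACHE[cell_uuid] = create_connection(cell_uuid)
            return CELL_CACHE[cell_uuid]
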
[ 536.981055] env[67964]: DEBUG dbcounter [-] [67964] Writer thread running {{(pid=67964) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 536.981822] env[67964]: DEBUG dbcounter [-] [67964] Writer thread running {{(pid=67964) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 536.984148] env[67964]: ERROR nova.db.main.api [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 536.984148] env[67964]:     result = function(*args, **kwargs)
[ 536.984148] env[67964]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 536.984148] env[67964]:     return func(*args, **kwargs)
[ 536.984148] env[67964]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 536.984148] env[67964]:     result = fn(*args, **kwargs)
[ 536.984148] env[67964]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 536.984148] env[67964]:     return f(*args, **kwargs)
[ 536.984148] env[67964]:   File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 536.984148] env[67964]:     return db.service_get_minimum_version(context, binaries)
[ 536.984148] env[67964]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 536.984148] env[67964]:     _check_db_access()
[ 536.984148] env[67964]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 536.984148] env[67964]:     stacktrace = ''.join(traceback.format_stack())
[ 536.984148] env[67964]:
[ 536.985153] env[67964]: ERROR nova.db.main.api [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 536.985153] env[67964]:     result = function(*args, **kwargs)
[ 536.985153] env[67964]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 536.985153] env[67964]:     return func(*args, **kwargs)
[ 536.985153] env[67964]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 536.985153] env[67964]:     result = fn(*args, **kwargs)
[ 536.985153] env[67964]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 536.985153] env[67964]:     return f(*args, **kwargs)
[ 536.985153] env[67964]:   File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 536.985153] env[67964]:     return db.service_get_minimum_version(context, binaries)
[ 536.985153] env[67964]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 536.985153] env[67964]:     _check_db_access()
[ 536.985153] env[67964]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 536.985153] env[67964]:     stacktrace = ''.join(traceback.format_stack())
[ 536.985153] env[67964]:
[ 536.985778] env[67964]: WARNING nova.objects.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 536.985955] env[67964]: WARNING nova.objects.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Failed to get minimum service version for cell fcf100b6-6d5c-4595-b5a6-17192ebda685
[ 536.986406] env[67964]: DEBUG oslo_concurrency.lockutils [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Acquiring lock "singleton_lock" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
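Note on the two ERROR tracebacks: nova-compute is deliberately configured without direct database access, so the DB API layer logs the offending call stack and refuses the query; the caller then falls back over RPC, which is why the subsequent "Failed to get minimum service version" lines are WARNINGs rather than fatal. A minimal sketch of such a guard, with hypothetical names (the frames visible in the logged stack, wrapper and _check_db_access, suggest this shape, but this is not nova's exact code):

    import traceback

    DISABLE_DB_ACCESS = True  # set when the service must not touch the DB

    class DBNotAllowed(Exception):
        pass

    def check_db_access():
        # Record where the forbidden DB call came from, then refuse it.
        stacktrace = ''.join(traceback.format_stack())
        print('No DB access allowed in nova-compute: %s' % stacktrace)
        raise DBNotAllowed('nova-compute')

    def db_api(func):
        # Wrap every DB API function with the guard.
        def wrapper(*args, **kwargs):
            if DISABLE_DB_ACCESS:
                check_db_access()
            return func(*args, **kwargs)
        return wrapper
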
[ 536.986639] env[67964]: DEBUG oslo_concurrency.lockutils [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Acquired lock "singleton_lock" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 536.986966] env[67964]: DEBUG oslo_concurrency.lockutils [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Releasing lock "singleton_lock" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 536.987381] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Full set of CONF: {{(pid=67964) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}}
[ 536.987596] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ******************************************************************************** {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}}
[ 536.987805] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] Configuration options gathered from: {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}}
[ 536.988034] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}}
[ 536.988317] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}}
[ 536.988524] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ================================================================================ {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}}
[ 536.988806] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] allow_resize_to_same_host = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.989071] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] arq_binding_timeout = 300 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.989294] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] backdoor_port = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.989511] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] backdoor_socket = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.989759] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] block_device_allocate_retries = 60 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
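Note on the "Full set of CONF" dump that begins here: the service resolves its three --config-file arguments in order, with values from later files overriding earlier ones, then logs every registered option. A minimal sketch of that layering with oslo.config, using one hypothetical option registration for illustration (nova registers its real options the same way):

    from oslo_config import cfg

    CONF = cfg.CONF
    # Hypothetical registration; in nova this happens in the options modules.
    CONF.register_opts([cfg.IntOpt('block_device_allocate_retries', default=60)])

    # Later --config-file arguments win, so a value set in nova-cpu-1.conf
    # overrides the same option set in nova.conf. The files must exist.
    CONF(['--config-file', '/etc/nova/nova.conf',
          '--config-file', '/etc/nova/nova-cpu-common.conf',
          '--config-file', '/etc/nova/nova-cpu-1.conf'],
         project='nova')
    print(CONF.block_device_allocate_retries)
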
[ 536.990014] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] block_device_allocate_retries_interval = 3 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.990273] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cert = self.pem {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.990524] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.990775] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute_monitors = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.991039] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] config_dir = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.991301] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] config_drive_format = iso9660 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.991513] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.991764] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] config_source = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.992023] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] console_host = devstack {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.992279] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] control_exchange = nova {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.992519] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cpu_allocation_ratio = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.992771] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] daemon = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.993022] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] debug = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.993271] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] default_access_ip_network_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.993522] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] default_availability_zone = nova {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.993763] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] default_ephemeral_format = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.994009] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] default_green_pool_size = 1000 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.994346] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.994610] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] default_schedule_zone = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.994857] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] disk_allocation_ratio = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.995120] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] enable_new_services = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.995389] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] enabled_apis = ['osapi_compute'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.995637] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] enabled_ssl_apis = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.995879] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] flat_injected = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.996134] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] force_config_drive = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.996379] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] force_raw_images = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.996631] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] graceful_shutdown_timeout = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.996874] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] heal_instance_info_cache_interval = 60 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.997178] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] host = cpu-1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.997433] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.997681] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] initial_disk_allocation_ratio = 1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.997924] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] initial_ram_allocation_ratio = 1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.998237] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.998512] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] instance_build_timeout = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.998763] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] instance_delete_interval = 300 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.999037] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] instance_format = [instance: %(uuid)s] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.999283] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] instance_name_template = instance-%08x {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.999536] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] instance_usage_audit = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 536.999798] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] instance_usage_audit_period = month {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.000062] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.000323] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] instances_path = /opt/stack/data/nova/instances {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.000578] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] internal_service_availability_zone = internal {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.000825] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] key = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.001086] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] live_migration_retry_count = 30 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.001340] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] log_config_append = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.001595] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.001836] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] log_dir = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.002096] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] log_file = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.002314] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] log_options = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.002560] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] log_rotate_interval = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.002809] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] log_rotate_interval_type = days {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.003076] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] log_rotation_type = none {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.003296] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.003513] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.003769] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.004033] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.004250] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.004498] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] long_rpc_timeout = 1800 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.004748] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] max_concurrent_builds = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.004990] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] max_concurrent_live_migrations = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.005244] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] max_concurrent_snapshots = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.005486] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] max_local_block_devices = 3 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.005731] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] max_logfile_count = 30 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.005983] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] max_logfile_size_mb = 200 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.006243] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] maximum_instance_delete_attempts = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.006493] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] metadata_listen = 0.0.0.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.006745] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] metadata_listen_port = 8775 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.006998] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] metadata_workers = 2 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.007256] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] migrate_max_retries = -1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.007512] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] mkisofs_cmd = genisoimage {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.007807] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] my_block_storage_ip = 10.180.1.21 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.008021] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] my_ip = 10.180.1.21 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.008273] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] network_allocate_retries = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.008534] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.008787] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] osapi_compute_listen = 0.0.0.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.009046] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] osapi_compute_listen_port = 8774 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.009307] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] osapi_compute_unique_server_name_scope = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.009573] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] osapi_compute_workers = 2 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.009819] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] password_length = 12 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.010076] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] periodic_enable = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.010331] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] periodic_fuzzy_delay = 60 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.010583] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] pointer_model = usbtablet {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.010832] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] preallocate_images = none {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.011091] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] publish_errors = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.011302] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] pybasedir = /opt/stack/nova {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.011549] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ram_allocation_ratio = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.011802] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] rate_limit_burst = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.012070] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] rate_limit_except_level = CRITICAL {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.012323] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] rate_limit_interval = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.012573] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] reboot_timeout = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.012822] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] reclaim_instance_interval = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.013072] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] record = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.013324] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] reimage_timeout_per_gb = 60 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.013578] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] report_interval = 120 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.013828] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] rescue_timeout = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.014091] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] reserved_host_cpus = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.014343] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] reserved_host_disk_mb = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.014597] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] reserved_host_memory_mb = 512 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.014841] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] reserved_huge_pages = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.015101] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] resize_confirm_window = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.015350] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] resize_fs_using_block_device = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.015599] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] resume_guests_state_on_host_boot = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.015847] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.016105] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] rpc_response_timeout = 60 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.016358] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] run_external_periodic_tasks = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.016611] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] running_deleted_instance_action = reap {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.016856] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] running_deleted_instance_poll_interval = 1800 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.017112] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] running_deleted_instance_timeout = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.017361] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] scheduler_instance_sync_interval = 120 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.017611] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] service_down_time = 720 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.017862] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] servicegroup_driver = db {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.018121] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] shelved_offload_time = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.018368] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] shelved_poll_interval = 3600 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.018620] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] shutdown_timeout = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.018864] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] source_is_ipv6 = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.019123] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ssl_only = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.019473] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.019722] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] sync_power_state_interval = 600 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.019968] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] sync_power_state_pool_size = 1000 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.020240] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] syslog_log_facility = LOG_USER {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.020480] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] tempdir = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.020726] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] timeout_nbd = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.020979] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] transport_url = **** {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.021234] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] update_resources_interval = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.021478] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] use_cow_images = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.021727] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] use_eventlog = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.021966] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] use_journal = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.022223] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] use_json = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.022469] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] use_rootwrap_daemon = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.022711] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] use_stderr = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.022954] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] use_syslog = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.023204] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vcpu_pin_set = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.023457] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vif_plugging_is_fatal = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.023710] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vif_plugging_timeout = 300 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.023956] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] virt_mkfs = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.024216] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] volume_usage_poll_interval = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.024463] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] watch_log_file = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.024718] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] web = /usr/share/spice-html5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 537.024988] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_concurrency.disable_process_locking = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.025376] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
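Note: from this point on the dump switches from [DEFAULT] options to group-scoped ones (oslo_concurrency.*, oslo_messaging_metrics.*, api.*, cache.*), each registered by the owning library and read as CONF.<group>.<option>. A minimal sketch of that mechanism with a hypothetical group and option (the real oslo_concurrency group is registered by the library itself on import):

    from oslo_config import cfg

    CONF = cfg.CONF
    group = cfg.OptGroup('my_group')  # hypothetical group for illustration
    CONF.register_group(group)
    CONF.register_opts([cfg.StrOpt('lock_path', default='/tmp')], group=group)

    CONF([], project='demo')
    print(CONF.my_group.lock_path)  # group-scoped access: CONF.<group>.<opt>
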
[ 537.025631] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.025884] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.026148] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.026407] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.026657] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.026917] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.auth_strategy = keystone {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.027177] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.compute_link_prefix = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.027440] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.027699] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.dhcp_domain = novalocal {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.027951] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.enable_instance_password = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.028213] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.glance_link_prefix = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.028459] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.028717] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.028965] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.instance_list_per_project_cells = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.029233] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.list_records_by_skipping_down_cells = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.029482] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.local_metadata_per_cell = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.029734] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.max_limit = 1000 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.029985] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.metadata_cache_expiration = 15 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.031542] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.neutron_default_tenant_id = default {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.031542] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.use_forwarded_for = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.031542] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.use_neutron_default_nets = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.031542] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.031542] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.031995] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.031995] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.032282] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.vendordata_dynamic_targets = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.032542] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.vendordata_jsonfile_path = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.032811] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.033104] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.backend = dogpile.cache.memcached {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.033390] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.backend_argument = **** {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.034706] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.config_prefix = cache.oslo {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.034706] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.dead_timeout = 60.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.034706] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.debug_cache_backend = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.034706] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.enable_retry_client = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.034706] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.enable_socket_keepalive = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.034706] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.enabled = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.034939] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.expiration_time = 600 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.034939] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.hashclient_retry_attempts = 2 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.034992] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.hashclient_retry_delay = 1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.035169] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.memcache_dead_retry = 300 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.035352] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.memcache_password = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.035531] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.035699] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.035865] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.memcache_pool_maxsize = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.036042] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.036214] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.memcache_sasl_enabled = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.036396] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.036568] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.memcache_socket_timeout = 1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.036739] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.memcache_username = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.036907] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.proxies = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.037082] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.retry_attempts = 2 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.037254] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.retry_delay = 0.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.037527] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.socket_keepalive_count = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.037581] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.socket_keepalive_idle = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.037739] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.socket_keepalive_interval = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.037897] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.tls_allowed_ciphers = None {{(pid=67964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.038064] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.tls_cafile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.038224] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.tls_certfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.038384] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.tls_enabled = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.038542] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cache.tls_keyfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.038708] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.auth_section = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.038879] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.auth_type = password {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.039052] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.cafile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.039234] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.catalog_info = volumev3::publicURL {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.039411] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.certfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.039568] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.collect_timing = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.039728] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.cross_az_attach = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.039890] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.debug = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.040059] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.endpoint_template = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.040224] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.http_retries = 3 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.040396] env[67964]: DEBUG oslo_service.service [None 
req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.insecure = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.040548] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.keyfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.040715] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.os_region_name = RegionOne {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.040876] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.split_loggers = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.041043] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cinder.timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.041218] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.041378] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute.cpu_dedicated_set = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.041536] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute.cpu_shared_set = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.041700] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute.image_type_exclude_list = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.041863] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.042036] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute.max_concurrent_disk_ops = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.042204] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute.max_disk_devices_to_attach = -1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.042370] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.042540] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.042703] env[67964]: DEBUG oslo_service.service 
[None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute.resource_provider_association_refresh = 300 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.042865] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute.sharing_providers_max_uuids_per_request = 200 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.043034] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute.shutdown_retry_interval = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.043217] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.043393] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] conductor.workers = 2 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.043570] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] console.allowed_origins = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.043731] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] console.ssl_ciphers = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.043900] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] console.ssl_minimum_version = default {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.044086] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] consoleauth.token_ttl = 600 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.044263] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.cafile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.044425] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.certfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.044591] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.collect_timing = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.044750] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.connect_retries = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.044909] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.connect_retry_delay = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.045078] env[67964]: DEBUG oslo_service.service [None 
req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.endpoint_override = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.045245] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.insecure = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.045402] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.keyfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.045564] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.max_version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.045721] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.min_version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.045878] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.region_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.046044] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.service_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.046214] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.service_type = accelerator {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.046375] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.split_loggers = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.046534] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.status_code_retries = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.046688] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.status_code_retry_delay = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.046845] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.047031] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.047196] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] cyborg.version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.047387] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.backend = sqlalchemy {{(pid=67964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.047562] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.connection = **** {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.047731] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.connection_debug = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.047901] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.connection_parameters = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.048080] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.connection_recycle_time = 3600 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.048252] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.connection_trace = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.048414] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.db_inc_retry_interval = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.048579] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.db_max_retries = 20 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.048741] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.db_max_retry_interval = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.048904] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.db_retry_interval = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.049083] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.max_overflow = 50 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.049247] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.max_pool_size = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.049424] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.max_retries = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.049587] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.049743] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.mysql_wsrep_sync_wait = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} 
[ 537.049904] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.pool_timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.050081] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.retry_interval = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.050242] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.slave_connection = **** {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.050411] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.sqlite_synchronous = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.050571] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] database.use_db_reconnect = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.050748] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.backend = sqlalchemy {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.050920] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.connection = **** {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.051099] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.connection_debug = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.051270] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.connection_parameters = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.051432] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.connection_recycle_time = 3600 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.051599] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.connection_trace = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.051760] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.db_inc_retry_interval = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.051922] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.db_max_retries = 20 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.052093] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.db_max_retry_interval = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.052255] env[67964]: DEBUG 
oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.db_retry_interval = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.052425] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.max_overflow = 50 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.052583] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.max_pool_size = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.052748] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.max_retries = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.052910] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.053076] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.053240] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.pool_timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.053406] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.retry_interval = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.053564] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.slave_connection = **** {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.053726] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] api_database.sqlite_synchronous = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.053898] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] devices.enabled_mdev_types = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.054082] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.054249] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ephemeral_storage_encryption.enabled = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.054418] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.054589] env[67964]: DEBUG 
oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.api_servers = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.054751] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.cafile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.054910] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.certfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.055084] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.collect_timing = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.055247] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.connect_retries = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.055405] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.connect_retry_delay = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.055571] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.debug = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.055735] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.default_trusted_certificate_ids = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.055899] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.enable_certificate_validation = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.056072] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.enable_rbd_download = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.056235] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.endpoint_override = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.056403] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.insecure = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.056571] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.keyfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.056726] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.max_version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.056882] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.min_version = None 
{{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.057053] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.num_retries = 3 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.057223] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.rbd_ceph_conf = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.057395] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.rbd_connect_timeout = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.057561] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.rbd_pool = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.057728] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.rbd_user = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.057886] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.region_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.058052] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.service_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.058224] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.service_type = image {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.058390] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.split_loggers = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.058552] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.status_code_retries = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.058710] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.status_code_retry_delay = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.058866] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.059054] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.059220] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.verify_glance_signatures = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.059388] 
env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] glance.version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.059552] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] guestfs.debug = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.059719] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.config_drive_cdrom = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.059880] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.config_drive_inject_password = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.060049] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.060214] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.enable_instance_metrics_collection = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.060375] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.enable_remotefx = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.060543] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.instances_path_share = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.060705] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.iscsi_initiator_list = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.060866] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.limit_cpu_features = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.061042] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.061207] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.061374] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.power_state_check_timeframe = 60 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.061535] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.061701] env[67964]: DEBUG 
oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.061861] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.use_multipath_io = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.062030] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.volume_attach_retry_count = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.062193] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.062350] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.vswitch_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.062515] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.062676] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] mks.enabled = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.063053] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.063246] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] image_cache.manager_interval = 2400 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.063416] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] image_cache.precache_concurrency = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.063587] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] image_cache.remove_unused_base_images = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.063758] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.063926] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.064113] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] image_cache.subdirectory_name = _base {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
537.064292] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.api_max_retries = 60 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.064457] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.api_retry_interval = 2 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.064620] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.auth_section = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.064783] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.auth_type = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.064939] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.cafile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.065107] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.certfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.065273] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.collect_timing = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.065435] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.conductor_group = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.065596] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.connect_retries = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.065750] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.connect_retry_delay = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.065904] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.endpoint_override = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.066072] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.insecure = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.066235] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.keyfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.066392] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.max_version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.066550] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.min_version = None {{(pid=67964) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.066714] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.peer_list = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.066870] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.region_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.067040] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.serial_console_state_timeout = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.067202] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.service_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.067374] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.service_type = baremetal {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.067530] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.split_loggers = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.067687] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.status_code_retries = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.067842] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.status_code_retry_delay = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.067999] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.068188] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.068350] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ironic.version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.068529] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.068703] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] key_manager.fixed_key = **** {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.068881] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=67964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.069052] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.barbican_api_version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.069214] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.barbican_endpoint = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.069397] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.barbican_endpoint_type = public {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.069550] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.barbican_region_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.069708] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.cafile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.069865] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.certfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.070036] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.collect_timing = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.070199] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.insecure = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.070358] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.keyfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.070523] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.number_of_retries = 60 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.070683] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.retry_delay = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.070847] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.send_service_user_token = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.071024] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.split_loggers = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.071179] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.071342] 
env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.verify_ssl = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.071500] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican.verify_ssl_path = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.071668] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican_service_user.auth_section = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.071827] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican_service_user.auth_type = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.071985] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican_service_user.cafile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.072159] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican_service_user.certfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.072324] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican_service_user.collect_timing = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.072489] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican_service_user.insecure = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.072647] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican_service_user.keyfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.072809] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican_service_user.split_loggers = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.072966] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] barbican_service_user.timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.073146] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.approle_role_id = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.073308] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.approle_secret_id = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.073468] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.cafile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.073623] env[67964]: DEBUG 
oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.certfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.073787] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.collect_timing = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.073949] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.insecure = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.074119] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.keyfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.074288] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.kv_mountpoint = secret {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.074446] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.kv_path = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.074608] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.kv_version = 2 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.074764] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.namespace = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.074922] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.root_token_id = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.075094] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.split_loggers = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.075256] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.ssl_ca_crt_file = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.075414] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.075576] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.use_ssl = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.075744] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.075910] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.auth_section = None {{(pid=67964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.076083] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.auth_type = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.076244] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.cafile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.076401] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.certfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.076565] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.collect_timing = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.076722] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.connect_retries = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.076878] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.connect_retry_delay = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.077044] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.endpoint_override = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.077210] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.insecure = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.077374] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.keyfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.077525] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.max_version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.077680] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.min_version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.077833] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.region_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.077987] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.service_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.078167] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.service_type = identity {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.078330] env[67964]: DEBUG 
oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.split_loggers = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.078489] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.status_code_retries = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.078646] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.status_code_retry_delay = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.078802] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.078978] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.079149] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] keystone.version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.079349] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.connection_uri = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.079504] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.cpu_mode = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.079671] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.cpu_model_extra_flags = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.079836] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.cpu_models = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.080008] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.cpu_power_governor_high = performance {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.080179] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.cpu_power_governor_low = powersave {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.080340] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.cpu_power_management = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.080509] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.080667] env[67964]: DEBUG oslo_service.service [None 
req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.device_detach_attempts = 8 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.080827] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.device_detach_timeout = 20 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.080989] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.disk_cachemodes = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.081161] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.disk_prefix = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.081328] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.enabled_perf_events = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.081493] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.file_backed_memory = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.081658] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.gid_maps = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.081815] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.hw_disk_discard = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.081970] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.hw_machine_type = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.082149] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.images_rbd_ceph_conf = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.082313] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.082482] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.082650] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.images_rbd_glance_store_name = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.082817] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.images_rbd_pool = rbd {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.082985] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] 
libvirt.images_type = default {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.083157] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.images_volume_group = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.083321] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.inject_key = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.083485] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.inject_partition = -2 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.083650] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.inject_password = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.083810] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.iscsi_iface = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.083973] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.iser_use_multipath = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.084151] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.live_migration_bandwidth = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.084316] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.084481] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.live_migration_downtime = 500 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.084649] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.084814] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.084976] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.live_migration_inbound_addr = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.085150] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.085315] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] 
libvirt.live_migration_permit_post_copy = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.085476] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.live_migration_scheme = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.085647] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.live_migration_timeout_action = abort {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.085809] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.live_migration_tunnelled = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.085967] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.live_migration_uri = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.086140] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.live_migration_with_native_tls = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.086302] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.max_queues = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.086465] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.086625] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.nfs_mount_options = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.086929] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.087113] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.087283] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.num_iser_scan_tries = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.087444] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.num_memory_encrypted_guests = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.087608] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.087772] env[67964]: DEBUG oslo_service.service [None 
req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.num_pcie_ports = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.087939] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.num_volume_scan_tries = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.088115] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.pmem_namespaces = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.088278] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.quobyte_client_cfg = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.088562] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.088734] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.rbd_connect_timeout = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.088900] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.089211] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.089263] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.rbd_secret_uuid = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.089494] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.rbd_user = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.089543] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.089718] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.remote_filesystem_transport = ssh {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.089914] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.rescue_image_id = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.090021] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.rescue_kernel_id = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.090180] env[67964]: DEBUG oslo_service.service [None 
req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.rescue_ramdisk_id = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.090350] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.090511] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.rx_queue_size = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.090676] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.smbfs_mount_options = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.090950] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.091137] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.snapshot_compression = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.091303] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.snapshot_image_format = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.091522] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.091694] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.sparse_logical_volumes = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.091859] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.swtpm_enabled = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.092036] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.swtpm_group = tss {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.092208] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.swtpm_user = tss {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.092376] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.sysinfo_serial = unique {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.092536] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.tb_cache_size = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.092695] env[67964]: DEBUG oslo_service.service [None 
req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.tx_queue_size = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.092858] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.uid_maps = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.093031] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.use_virtio_for_bridges = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.093208] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.virt_type = kvm {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.093379] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.volume_clear = zero {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.093544] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.volume_clear_size = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.093712] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.volume_use_multipath = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.093871] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.vzstorage_cache_path = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.094053] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.094227] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.vzstorage_mount_group = qemu {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.094393] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.vzstorage_mount_opts = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.094562] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.094836] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.095025] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.vzstorage_mount_user = stack {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.095193] env[67964]: DEBUG oslo_service.service [None 
req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.095366] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.auth_section = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.095541] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.auth_type = password {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.095705] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.cafile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.095866] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.certfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.096046] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.collect_timing = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.096210] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.connect_retries = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.096369] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.connect_retry_delay = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.096539] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.default_floating_pool = public {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.096698] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.endpoint_override = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.096860] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.extension_sync_interval = 600 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.097031] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.http_retries = 3 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.097201] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.insecure = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.097361] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.keyfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.097519] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.max_version = None {{(pid=67964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.097689] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.097847] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.min_version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.098023] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.ovs_bridge = br-int {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.098192] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.physnets = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.098361] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.region_name = RegionOne {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.098533] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.service_metadata_proxy = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.098694] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.service_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.098860] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.service_type = network {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.099031] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.split_loggers = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.099195] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.status_code_retries = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.099360] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.status_code_retry_delay = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.099515] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.099693] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.099855] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] neutron.version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
537.100036] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] notifications.bdms_in_notifications = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.100225] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] notifications.default_level = INFO {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.100400] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] notifications.notification_format = unversioned {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.100565] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] notifications.notify_on_state_change = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.100738] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.100914] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] pci.alias = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.101094] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] pci.device_spec = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.101263] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] pci.report_in_placement = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.101435] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.auth_section = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.101609] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.auth_type = password {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.101777] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.101935] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.cafile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.102104] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.certfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.102269] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.collect_timing = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.102427] env[67964]: 
DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.connect_retries = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.102586] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.connect_retry_delay = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.102742] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.default_domain_id = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.102898] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.default_domain_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.103061] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.domain_id = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.103218] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.domain_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.103374] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.endpoint_override = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.103534] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.insecure = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.103688] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.keyfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.103842] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.max_version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.103994] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.min_version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.104171] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.password = **** {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.104331] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.project_domain_id = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.104495] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.project_domain_name = Default {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.104661] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] 
placement.project_id = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.104832] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.project_name = service {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.105006] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.region_name = RegionOne {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.105172] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.service_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.105340] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.service_type = placement {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.105505] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.split_loggers = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.105664] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.status_code_retries = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.105821] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.status_code_retry_delay = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.105978] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.system_scope = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.106150] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.106309] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.trust_id = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.106467] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.user_domain_id = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.106632] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.user_domain_name = Default {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.106790] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.user_id = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.106957] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.username = placement {{(pid=67964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.107147] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.107309] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] placement.version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.107486] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] quota.cores = 20 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.107653] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] quota.count_usage_from_placement = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.107823] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.107993] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] quota.injected_file_content_bytes = 10240 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.108171] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] quota.injected_file_path_length = 255 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.108336] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] quota.injected_files = 5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.108503] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] quota.instances = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.108667] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] quota.key_pairs = 100 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.108831] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] quota.metadata_items = 128 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.108995] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] quota.ram = 51200 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.109175] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] quota.recheck_quota = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.109362] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] quota.server_group_members = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.109518] env[67964]: 
DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] quota.server_groups = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.109686] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] rdp.enabled = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.109997] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.110195] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.110367] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.110535] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] scheduler.image_metadata_prefilter = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.110697] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.110864] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] scheduler.max_attempts = 3 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.111038] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] scheduler.max_placement_results = 1000 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.111208] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.111373] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] scheduler.query_placement_for_image_type_support = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.111539] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.111711] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] scheduler.workers = 2 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.111884] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.112064] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.112246] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.112416] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.112584] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.112746] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.112909] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.113106] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.113275] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.host_subset_size = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.113439] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.113597] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.113766] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.113932] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.isolated_hosts = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.114105] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.isolated_images = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.114270] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.114431] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.114594] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.114755] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.pci_in_placement = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.114915] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.115090] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.115253] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.115416] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.115579] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.115741] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.115903] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.track_instance_changes = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.116088] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.116262] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] metrics.required = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.116424] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] metrics.weight_multiplier = 1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.116586] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.116747] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] metrics.weight_setting = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.117055] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.117234] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] serial_console.enabled = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.117408] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] serial_console.port_range = 10000:20000 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.117578] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.117745] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.117912] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] serial_console.serialproxy_port = 6083 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.118091] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] service_user.auth_section = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.118266] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] service_user.auth_type = password {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.118426] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] service_user.cafile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.118584] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] service_user.certfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.118744] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] service_user.collect_timing = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.118901] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] service_user.insecure = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.119081] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] service_user.keyfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.119253] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] service_user.send_service_user_token = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.119421] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] service_user.split_loggers = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.119578] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] service_user.timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.119747] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] spice.agent_enabled = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.119910] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] spice.enabled = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.120209] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.120401] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.120571] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] spice.html5proxy_port = 6082 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.120732] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] spice.image_compression = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.120888] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] spice.jpeg_compression = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.121053] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] spice.playback_compression = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.121224] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] spice.server_listen = 127.0.0.1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.121393] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.121553] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] spice.streaming_mode = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.121710] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] spice.zlib_compression = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.121875] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] upgrade_levels.baseapi = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.122045] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] upgrade_levels.cert = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.122215] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] upgrade_levels.compute = auto {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.122374] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] upgrade_levels.conductor = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.122534] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] upgrade_levels.scheduler = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.122699] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vendordata_dynamic_auth.auth_section = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.122861] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vendordata_dynamic_auth.auth_type = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.123028] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vendordata_dynamic_auth.cafile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.123190] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vendordata_dynamic_auth.certfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.123355] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.123518] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vendordata_dynamic_auth.insecure = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.123676] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vendordata_dynamic_auth.keyfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.123837] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.123994] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vendordata_dynamic_auth.timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.124180] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.api_retry_count = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.124339] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.ca_file = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.124510] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.cache_prefix = devstack-image-cache {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.124674] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.cluster_name = testcl1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.124834] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.connection_pool_size = 10 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.124990] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.console_delay_seconds = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.125167] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.datastore_regex = ^datastore.* {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.125368] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.125541] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.host_password = **** {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.125707] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.host_port = 443 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.125873] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.host_username = administrator@vsphere.local {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.126049] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.insecure = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.126213] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.integration_bridge = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.126376] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.maximum_objects = 100 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.126534] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.pbm_default_policy = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.126694] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.pbm_enabled = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.126847] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.pbm_wsdl_location = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.127026] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.127179] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.serial_port_proxy_uri = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.127336] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.serial_port_service_uri = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.127499] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.task_poll_interval = 0.5 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.127667] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.use_linked_clone = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.127832] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.vnc_keymap = en-us {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.127996] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.vnc_port = 5900 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.128173] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vmware.vnc_port_total = 10000 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.128357] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vnc.auth_schemes = ['none'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.128529] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vnc.enabled = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.128820] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.129006] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.129186] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vnc.novncproxy_port = 6080 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.129365] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vnc.server_listen = 127.0.0.1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.129538] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.129696] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vnc.vencrypt_ca_certs = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.129850] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vnc.vencrypt_client_cert = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.130009] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vnc.vencrypt_client_key = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.130186] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.130350] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.disable_deep_image_inspection = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.130511] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.130669] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.130828] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.130988] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.disable_rootwrap = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.131160] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.enable_numa_live_migration = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.131320] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.131481] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.131639] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.131797] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.libvirt_disable_apic = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.131954] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.132126] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.132288] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.132450] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.132613] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.132772] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.132930] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.133099] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.133261] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.133424] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.133608] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.133778] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] wsgi.client_socket_timeout = 900 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.133943] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] wsgi.default_pool_size = 1000 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.134121] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] wsgi.keep_alive = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.134300] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] wsgi.max_header_line = 16384 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.134847] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] wsgi.secure_proxy_ssl_header = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.134847] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] wsgi.ssl_ca_file = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.134847] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] wsgi.ssl_cert_file = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.135036] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] wsgi.ssl_key_file = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.135078] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] wsgi.tcp_keepidle = 600 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.135243] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.135410] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] zvm.ca_file = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.135570] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] zvm.cloud_connector_url = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.135850] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.136032] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] zvm.reachable_timeout = 300 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.136221] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_policy.enforce_new_defaults = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.136390] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_policy.enforce_scope = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.136567] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_policy.policy_default_rule = default {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.136747] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.136918] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_policy.policy_file = policy.yaml {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.137097] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.137260] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.137419] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.137577] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.137740] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.137905] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.138085] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.138261] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] profiler.connection_string = messaging:// {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.138427] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] profiler.enabled = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.138595] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] profiler.es_doc_type = notification {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.138754] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] profiler.es_scroll_size = 10000 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.138921] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] profiler.es_scroll_time = 2m {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.139100] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] profiler.filter_error_trace = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.139270] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] profiler.hmac_keys = **** {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.139437] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] profiler.sentinel_service_name = mymaster {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.139602] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] profiler.socket_timeout = 0.1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.139761] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] profiler.trace_requests = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.139917] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] profiler.trace_sqlalchemy = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.140112] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] profiler_jaeger.process_tags = {} {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.140275] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] profiler_jaeger.service_name_prefix = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.140439] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] profiler_otlp.service_name_prefix = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.140605] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] remote_debug.host = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.140762] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] remote_debug.port = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.140937] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.141111] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.141276] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.141437] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.141670] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.141751] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.141912] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.142085] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.142251] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.142410] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.142580] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.142748] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.142918] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.143096] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.143266] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.143442] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.143608] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.143770] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.143935] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.144110] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.144275] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.144441] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.144604] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.144767] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.144933] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.145109] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.ssl = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.145283] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.145453] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.145618] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.145788] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.145957] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_rabbit.ssl_version = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.146155] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.146323] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_notifications.retry = -1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.146504] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.146677] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_messaging_notifications.transport_url = **** {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.146846] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.auth_section = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.147025] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.auth_type = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.147184] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.cafile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.147345] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.certfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.147508] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.collect_timing = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.147667] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.connect_retries = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.147822] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.connect_retry_delay = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.147978] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.endpoint_id = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.148146] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.endpoint_override = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.148306] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.insecure = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.148462] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.keyfile = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.148618] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.max_version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.148773] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.min_version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.148928] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.region_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.149094] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.service_name = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.149252] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.service_type = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.149417] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.split_loggers = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.149577] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.status_code_retries = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.149737] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.status_code_retry_delay = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.149892] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.timeout = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.150059] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.valid_interfaces = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.150220] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_limit.version = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.150387] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_reports.file_event_handler = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.150553] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.150712] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] oslo_reports.log_dir = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.150883] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.151053] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.151217] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.151383] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.151549] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.151707] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.151874] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.152044] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vif_plug_ovs_privileged.group = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.152207] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.152372] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.152536] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.152695] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] vif_plug_ovs_privileged.user = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.152863] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_linux_bridge.flat_interface = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.153050] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.153228] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.153397] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.153567] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.153732] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.153895] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.154063] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.154242] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.154411] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_ovs.isolate_vif = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.154576] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.154742] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.154907] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.155089] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_ovs.ovsdb_interface = native {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.155253] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_vif_ovs.per_port_bridge = False {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.155416] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_brick.lock_path = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.155580] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.155739] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.155908] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] privsep_osbrick.capabilities = [21] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 537.156074]
env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] privsep_osbrick.group = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.156235] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] privsep_osbrick.helper_command = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.156398] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.156561] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.156717] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] privsep_osbrick.user = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.156885] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.157049] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] nova_sys_admin.group = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.157209] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] nova_sys_admin.helper_command = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.157372] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.157530] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.157685] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] nova_sys_admin.user = None {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 537.157813] env[67964]: DEBUG oslo_service.service [None req-67ff4cdc-2612-4cbf-b1ad-8b54ea154743 None None] ******************************************************************************** {{(pid=67964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 537.158234] env[67964]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 537.168572] env[67964]: WARNING nova.virt.vmwareapi.driver [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. 
If you are using the driver in production please let us know via the openstack-discuss mailing list. [ 537.169016] env[67964]: INFO nova.virt.node [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Generated node identity 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 [ 537.169243] env[67964]: INFO nova.virt.node [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Wrote node identity 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 to /opt/stack/data/n-cpu-1/compute_id [ 537.182508] env[67964]: WARNING nova.compute.manager [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Compute nodes ['2c116cee-6c2d-4cdd-b5f2-5697c0d45f41'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 537.215372] env[67964]: INFO nova.compute.manager [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 537.235207] env[67964]: WARNING nova.compute.manager [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 537.235428] env[67964]: DEBUG oslo_concurrency.lockutils [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 537.235638] env[67964]: DEBUG oslo_concurrency.lockutils [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 537.235779] env[67964]: DEBUG oslo_concurrency.lockutils [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 537.235933] env[67964]: DEBUG nova.compute.resource_tracker [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 537.238223] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1117ab99-12ca-4011-ab6a-214c2828ecba {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 537.245577] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a653c26-097b-45d7-94c3-336f44a1706b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 537.259053] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9196fa58-beac-4939-a291-51fa5b8d545e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 537.265129] env[67964]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47a5d37d-a448-4a20-ba11-80c27b4a59ef {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 537.294473] env[67964]: DEBUG nova.compute.resource_tracker [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180924MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 537.294561] env[67964]: DEBUG oslo_concurrency.lockutils [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 537.294740] env[67964]: DEBUG oslo_concurrency.lockutils [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 537.305625] env[67964]: WARNING nova.compute.resource_tracker [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] No compute node record for cpu-1:2c116cee-6c2d-4cdd-b5f2-5697c0d45f41: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 could not be found. [ 537.317612] env[67964]: INFO nova.compute.resource_tracker [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 [ 537.368115] env[67964]: DEBUG nova.compute.resource_tracker [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 537.368331] env[67964]: DEBUG nova.compute.resource_tracker [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 537.472266] env[67964]: INFO nova.scheduler.client.report [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] [req-850e81ef-4168-421e-8e30-92d0e0e07a95] Created resource provider record via placement API for resource provider with UUID 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
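The inventory this service reports to Placement in the entries that follow determines how much of the node is actually schedulable: for each resource class, Placement treats (total - reserved) * allocation_ratio as the capacity, with max_unit capping what any single allocation may consume. A minimal sketch of that arithmetic, using the values from these log entries — the capacity formula is the standard Placement one, but the helper itself is illustrative, not Nova or Placement code:

    # Illustrative only: recompute the schedulable capacity Placement derives
    # from the inventory logged below for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41.
    # Assumes the standard Placement capacity formula:
    #     capacity = (total - reserved) * allocation_ratio
    # with max_unit bounding any single allocation (e.g. 16 VCPU per instance here).

    INVENTORY = {  # values copied from the set_inventory_for_provider entries
        'VCPU':      {'total': 48,     'reserved': 0,   'max_unit': 16,    'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'max_unit': 65530, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'max_unit': 95,    'allocation_ratio': 1.0},
    }

    def capacity(inventory):
        # Per-resource-class schedulable capacity as Placement would compute it.
        return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
                for rc, v in inventory.items()}

    print(capacity(INVENTORY))
    # {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}

This is why later "Claim successful" entries succeed trivially on an empty node: 1-vCPU/128 MB m1.nano claims are checked against 192 effective vCPUs and ~196 GB of RAM.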
[ 537.488983] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c4140d4-d8ce-4968-9783-9c3f74bc6e1c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 537.496609] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afc4f3ca-21ef-4577-823e-33521115f57c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 537.525660] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e21d176-7a33-4b7a-8a16-3706a10347ee {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 537.532621] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-848865f6-4d1c-4ca3-a0ad-46dc6ba82d70 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 537.545189] env[67964]: DEBUG nova.compute.provider_tree [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Updating inventory in ProviderTree for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 537.583944] env[67964]: DEBUG nova.scheduler.client.report [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Updated inventory for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:971}} [ 537.584211] env[67964]: DEBUG nova.compute.provider_tree [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Updating resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 generation from 0 to 1 during operation: update_inventory {{(pid=67964) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 537.584360] env[67964]: DEBUG nova.compute.provider_tree [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Updating inventory in ProviderTree for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 537.634180] env[67964]: DEBUG nova.compute.provider_tree [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Updating resource 
provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 generation from 1 to 2 during operation: update_traits {{(pid=67964) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 537.651653] env[67964]: DEBUG nova.compute.resource_tracker [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 537.651833] env[67964]: DEBUG oslo_concurrency.lockutils [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.357s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 537.651995] env[67964]: DEBUG nova.service [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Creating RPC server for service compute {{(pid=67964) start /opt/stack/nova/nova/service.py:182}} [ 537.665306] env[67964]: DEBUG nova.service [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] Join ServiceGroup membership for this service compute {{(pid=67964) start /opt/stack/nova/nova/service.py:199}} [ 537.665485] env[67964]: DEBUG nova.servicegroup.drivers.db [None req-21fd9a03-8627-4d1d-a853-7aba912c28b8 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=67964) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 546.983663] env[67964]: DEBUG dbcounter [-] [67964] Writing DB stats nova_cell0:SELECT=1 {{(pid=67964) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 546.984817] env[67964]: DEBUG dbcounter [-] [67964] Writing DB stats nova_cell1:SELECT=1 {{(pid=67964) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 559.670102] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 559.680521] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Getting list of instances from cluster (obj){ [ 559.680521] env[67964]: value = "domain-c8" [ 559.680521] env[67964]: _type = "ClusterComputeResource" [ 559.680521] env[67964]: } {{(pid=67964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 559.681628] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57f6de1b-756d-4062-9fd7-cb02d5aab233 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.691191] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Got total of 0 instances {{(pid=67964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 559.691419] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 559.691736] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Getting list of instances from cluster (obj){ [ 559.691736] 
env[67964]: value = "domain-c8" [ 559.691736] env[67964]: _type = "ClusterComputeResource" [ 559.691736] env[67964]: } {{(pid=67964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 559.692606] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24af1656-fe22-4da8-b137-d0733fa0414e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.700442] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Got total of 0 instances {{(pid=67964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 578.038206] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Acquiring lock "6ebecddf-098f-447f-a350-6644b50f87f7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 578.038518] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Lock "6ebecddf-098f-447f-a350-6644b50f87f7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 578.061535] env[67964]: DEBUG nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Starting instance... 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 578.194822] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 578.197296] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 578.197296] env[67964]: INFO nova.compute.claims [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 578.334074] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-150c42b8-86fe-487d-95dc-23dde24ed862 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.346943] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c4816b7-d7ee-4231-9900-1272da7ea240 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.385525] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab3d0520-1adf-4a17-b469-b81161a176db {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.394665] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f6ea921-6962-40ac-81b2-f0b6877914a7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.408701] env[67964]: DEBUG nova.compute.provider_tree [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 578.422547] env[67964]: DEBUG nova.scheduler.client.report [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 578.444707] env[67964]: DEBUG oslo_concurrency.lockutils [None 
req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.249s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 578.445526] env[67964]: DEBUG nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 578.499064] env[67964]: DEBUG nova.compute.utils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 578.501630] env[67964]: DEBUG nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 578.501630] env[67964]: DEBUG nova.network.neutron [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 578.535130] env[67964]: DEBUG nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 578.627971] env[67964]: DEBUG nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 579.770253] env[67964]: DEBUG nova.virt.hardware [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 579.772057] env[67964]: DEBUG nova.virt.hardware [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 579.772382] env[67964]: DEBUG nova.virt.hardware [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 579.772704] env[67964]: DEBUG nova.virt.hardware [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 579.775537] env[67964]: DEBUG nova.virt.hardware [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 579.775748] env[67964]: DEBUG nova.virt.hardware [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 579.776023] env[67964]: DEBUG nova.virt.hardware [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 579.776215] env[67964]: DEBUG nova.virt.hardware [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 579.776904] env[67964]: DEBUG 
nova.virt.hardware [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 579.777173] env[67964]: DEBUG nova.virt.hardware [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 579.777423] env[67964]: DEBUG nova.virt.hardware [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 579.778685] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1079b03f-66b7-4a8c-834a-637c7a716750 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 579.793598] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0370fe7-af78-48d4-b74e-12ec59da38a7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 579.818244] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f922d80-99c6-49b6-b1a1-d30671fb99d6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.014995] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Acquiring lock "e55fdbbb-813d-427c-a53f-5be3fbeeb531" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 580.014995] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Lock "e55fdbbb-813d-427c-a53f-5be3fbeeb531" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 580.041021] env[67964]: DEBUG nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Starting instance... 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 580.116087] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 580.116335] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 580.118353] env[67964]: INFO nova.compute.claims [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 580.285266] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93b14c92-32a4-4242-9bb9-99e8ba8d4e11 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.299179] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c38efc6-b5e7-4b19-8fa0-918a1c9a3ef6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.341168] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf3348e6-a032-4c75-9034-b47c557ad0c8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.349992] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3bdb0ba-cea8-4f7b-86fe-20fc258a0b56 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.367168] env[67964]: DEBUG nova.compute.provider_tree [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 580.380286] env[67964]: DEBUG nova.policy [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '993d71f18f674c308e6137b3dddfbd5a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5115077efe5148fe9ce1a6f54b0f0178', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 580.383689] env[67964]: DEBUG nova.scheduler.client.report [None req-a7362511-6f67-465d-a719-2d25d9d13d08 
tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 580.404398] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.288s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 580.405070] env[67964]: DEBUG nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 580.455734] env[67964]: DEBUG nova.compute.utils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 580.460111] env[67964]: DEBUG nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 580.460731] env[67964]: DEBUG nova.network.neutron [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 580.472261] env[67964]: DEBUG nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 580.566726] env[67964]: DEBUG nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 580.630311] env[67964]: DEBUG nova.virt.hardware [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 580.631737] env[67964]: DEBUG nova.virt.hardware [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 580.631967] env[67964]: DEBUG nova.virt.hardware [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 580.632239] env[67964]: DEBUG nova.virt.hardware [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 580.632422] env[67964]: DEBUG nova.virt.hardware [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 580.632613] env[67964]: DEBUG nova.virt.hardware [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 580.632981] env[67964]: DEBUG nova.virt.hardware [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 580.633964] env[67964]: DEBUG nova.virt.hardware [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 580.634218] env[67964]: DEBUG nova.virt.hardware [None req-a7362511-6f67-465d-a719-2d25d9d13d08 
tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 580.634439] env[67964]: DEBUG nova.virt.hardware [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 580.634717] env[67964]: DEBUG nova.virt.hardware [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 580.636467] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f450b466-fa81-4828-8812-e8e2a0f7132f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.654526] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-548a9130-408f-4eb5-a74e-beadfe0796bd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.004480] env[67964]: DEBUG nova.policy [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '47d0630b404245328afba598a66c912f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4843a8f271fc438fb1296f2ae501f703', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 582.426135] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Acquiring lock "93509103-8c02-420d-bcaa-c2cf0847b1f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 582.426653] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Lock "93509103-8c02-420d-bcaa-c2cf0847b1f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 582.442393] env[67964]: DEBUG nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Starting instance... 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 582.528746] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 582.528746] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 582.531815] env[67964]: INFO nova.compute.claims [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 582.547799] env[67964]: DEBUG nova.network.neutron [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Successfully created port: 41b8e312-a79a-49c8-98fd-dca83de83b7c {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 582.688963] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b0b14b5-779b-441d-8035-825106091800 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.704524] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94aa5721-33f6-428d-a72b-d67d781ef341 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.747744] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b0ba2ea-7b67-4fc8-a637-2e2e8f875b22 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.756580] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ddeb156-d749-4291-853e-e0ecd3ef1a51 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.771310] env[67964]: DEBUG nova.compute.provider_tree [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 582.781665] env[67964]: DEBUG nova.scheduler.client.report [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 582.805378] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.277s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 582.805886] env[67964]: DEBUG nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 582.845545] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Acquiring lock "371aeb17-ad59-4a01-88f7-466dfee8d293" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 582.847148] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Lock "371aeb17-ad59-4a01-88f7-466dfee8d293" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 582.861726] env[67964]: DEBUG nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 582.866624] env[67964]: DEBUG nova.compute.utils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 582.871567] env[67964]: DEBUG nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Allocating IP information in the background. 
{{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 582.871567] env[67964]: DEBUG nova.network.neutron [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 582.902401] env[67964]: DEBUG nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 582.963554] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 582.963830] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 582.965338] env[67964]: INFO nova.compute.claims [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 583.010843] env[67964]: DEBUG nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 583.043532] env[67964]: DEBUG nova.virt.hardware [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 583.043532] env[67964]: DEBUG nova.virt.hardware [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 583.043532] env[67964]: DEBUG nova.virt.hardware [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 583.043860] env[67964]: DEBUG nova.virt.hardware [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 583.044133] env[67964]: DEBUG nova.virt.hardware [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 583.044382] env[67964]: DEBUG nova.virt.hardware [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 583.046018] env[67964]: DEBUG nova.virt.hardware [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 583.046018] env[67964]: DEBUG nova.virt.hardware [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 583.046018] env[67964]: DEBUG nova.virt.hardware [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 583.046018] env[67964]: DEBUG nova.virt.hardware [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 583.046018] env[67964]: DEBUG nova.virt.hardware [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 583.047175] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9124ad8-780a-4b9f-b43a-874b66b830fb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.062363] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-818a23db-c148-4480-835a-9f6d4767402d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.120207] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdad6faa-6c5b-48ae-81af-7fc1724b3689 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.128345] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec15e387-4df7-46d9-8ab9-89b688cc3484 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.161064] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e8bcb7b-f259-48dc-9bfc-ae002f7d8b01 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.169882] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c612b083-b96b-4206-aa83-91de12abf949 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.189061] env[67964]: DEBUG nova.compute.provider_tree [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 583.202632] env[67964]: DEBUG nova.scheduler.client.report [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 583.221090] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 583.221552] env[67964]: DEBUG nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 583.264894] env[67964]: DEBUG nova.compute.utils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 583.266329] env[67964]: DEBUG nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 583.266329] env[67964]: DEBUG nova.network.neutron [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 583.279827] env[67964]: DEBUG nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 583.359061] env[67964]: DEBUG nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Start spawning the instance on the hypervisor. 
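The nova.virt.hardware DEBUG runs before and after this point trace CPU topology selection for the m1.nano flavor: with no flavor or image constraints (limits and preferences all 0:0:0) the maximums fall back to 65536 per dimension, and a single vCPU admits exactly one topology, 1 socket x 1 core x 1 thread. A simplified illustration of that search, using only values from the log (a sketch of the idea, not Nova's actual _get_possible_cpu_topologies):

    from itertools import product

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        # Keep (sockets, cores, threads) combinations whose product equals
        # the vCPU count and which stay within the per-dimension maximums.
        return [
            (s, c, t)
            for s, c, t in product(range(1, vcpus + 1), repeat=3)
            if s * c * t == vcpus
            and s <= max_sockets and c <= max_cores and t <= max_threads
        ]

    print(possible_topologies(1))  # [(1, 1, 1)] -- "Got 1 possible topologies"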
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 583.380740] env[67964]: DEBUG nova.virt.hardware [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 583.382023] env[67964]: DEBUG nova.virt.hardware [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 583.382023] env[67964]: DEBUG nova.virt.hardware [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 583.382023] env[67964]: DEBUG nova.virt.hardware [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 583.382023] env[67964]: DEBUG nova.virt.hardware [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 583.382023] env[67964]: DEBUG nova.virt.hardware [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 583.382436] env[67964]: DEBUG nova.virt.hardware [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 583.382722] env[67964]: DEBUG nova.virt.hardware [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 583.383026] env[67964]: DEBUG nova.virt.hardware [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 583.383316] env[67964]: DEBUG nova.virt.hardware [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 583.386033] env[67964]: DEBUG nova.virt.hardware [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 583.386033] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9689b89a-3719-4377-a906-43b030447754 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.394383] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9d8b185-a01b-44b9-8d1e-653171269648 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.505231] env[67964]: DEBUG nova.policy [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0483a7b0d9664b558abcd9d5be757f0f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '51a75cd8fdaa47f8b37e6c78769a63f8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 583.814085] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquiring lock "82096302-bbdd-49b4-bd19-bdf75343e03a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 583.814252] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "82096302-bbdd-49b4-bd19-bdf75343e03a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 583.824777] env[67964]: DEBUG nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 
tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 583.895974] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 583.896242] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 583.897868] env[67964]: INFO nova.compute.claims [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 583.900739] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Acquiring lock "bd297ef0-fa45-43c1-ab4e-14bcce806b35" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 583.900907] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Lock "bd297ef0-fa45-43c1-ab4e-14bcce806b35" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 583.916014] env[67964]: DEBUG nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Starting instance... 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 583.981245] env[67964]: DEBUG nova.policy [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3ae69428aa9246fd9ca5169a669c9f33', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2b0971e98f514aafb057e5c603c2137a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 583.989196] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 584.140020] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59919c7d-6779-4a7a-ba02-920db48b213e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.148356] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25016889-ff6c-425e-93c6-aa5c2c142d9f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.180413] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68cbd1af-c7aa-444f-95e0-6c17d69de140 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.188770] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db57a83e-1ada-4b3b-8968-f97223142d83 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.202968] env[67964]: DEBUG nova.compute.provider_tree [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 584.214180] env[67964]: DEBUG nova.scheduler.client.report [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 584.218202] env[67964]: DEBUG nova.network.neutron [None 
req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Successfully updated port: 41b8e312-a79a-49c8-98fd-dca83de83b7c {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 584.238345] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Acquiring lock "refresh_cache-6ebecddf-098f-447f-a350-6644b50f87f7" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 584.238457] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Acquired lock "refresh_cache-6ebecddf-098f-447f-a350-6644b50f87f7" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 584.239512] env[67964]: DEBUG nova.network.neutron [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 584.244674] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.348s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 584.245172] env[67964]: DEBUG nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Start building networks asynchronously for instance. 
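The "Allocating IP information in the background" / allocate_for_instance() pairs show the compute manager handing Neutron port creation to a green thread so block-device and spawn preparation can proceed in parallel, joining the result once network info is actually needed. Nova wraps this in a NetworkInfoAsyncWrapper rather than calling spawn directly, so the snippet below is only a rough analogue:

    import eventlet

    def allocate_for_instance(instance_uuid):
        # Placeholder for the Neutron calls that create and bind the port;
        # the dict shape loosely mirrors the network_info entries in this log.
        return [{"id": "port-uuid", "active": True}]

    gt = eventlet.spawn(allocate_for_instance, "93509103-8c02-420d-bcaa-c2cf0847b1f0")
    # ... build block device mappings while Neutron works ...
    network_info = gt.wait()  # join before VIF info is built for the spawn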
{{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 584.251017] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.258s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 584.251017] env[67964]: INFO nova.compute.claims [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 584.307810] env[67964]: DEBUG nova.compute.utils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 584.312968] env[67964]: DEBUG nova.network.neutron [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 584.316137] env[67964]: DEBUG nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 584.316137] env[67964]: DEBUG nova.network.neutron [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 584.322922] env[67964]: DEBUG nova.network.neutron [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Successfully created port: 533485e1-3b77-4d77-af0f-65077a2275fe {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 584.325383] env[67964]: DEBUG nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 584.438431] env[67964]: DEBUG nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 584.478915] env[67964]: DEBUG nova.virt.hardware [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 584.479184] env[67964]: DEBUG nova.virt.hardware [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 584.479333] env[67964]: DEBUG nova.virt.hardware [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 584.479595] env[67964]: DEBUG nova.virt.hardware [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 584.479687] env[67964]: DEBUG nova.virt.hardware [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 584.479820] env[67964]: DEBUG nova.virt.hardware [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 584.480308] env[67964]: DEBUG nova.virt.hardware [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 584.480308] env[67964]: DEBUG nova.virt.hardware [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
584.480445] env[67964]: DEBUG nova.virt.hardware [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 584.480529] env[67964]: DEBUG nova.virt.hardware [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 584.480689] env[67964]: DEBUG nova.virt.hardware [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 584.481586] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01541952-2359-4430-8954-75595ef41fd1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.490460] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47110ba3-43ef-44b2-99d8-b281e842eb02 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.503319] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ae1fe03-59ab-4225-a3a1-cba979812bd3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.530171] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acdd594d-d8d5-4e7f-ba08-71862d2241c0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.565027] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f711f136-1352-4e66-9469-6d37412948d7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.572732] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dadce27-5a45-4d27-b447-3aaec38f6c34 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.588245] env[67964]: DEBUG nova.compute.provider_tree [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 584.597045] env[67964]: DEBUG nova.scheduler.client.report [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 584.607510] env[67964]: DEBUG nova.policy [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a763e8aff6184f72b8f07826702fe981', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2ece3f919eb54985b4ab3bf0a9362717', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 584.626438] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.379s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 584.626997] env[67964]: DEBUG nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 584.703966] env[67964]: DEBUG nova.compute.utils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 584.709076] env[67964]: DEBUG nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 584.709199] env[67964]: DEBUG nova.network.neutron [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 584.728348] env[67964]: DEBUG nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 584.865020] env[67964]: DEBUG nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Start spawning the instance on the hypervisor. 
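The inventory dict the report client keeps comparing is what Placement may allocate from provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41: usable capacity per resource class is (total - reserved) * allocation_ratio, and max_unit caps any single allocation. Working that through with the logged values:

    # Capacity implied by the inventory logged above.
    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0, "max_unit": 16},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0, "max_unit": 65530},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0, "max_unit": 95},
    }
    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: {capacity:g} allocatable, at most {inv['max_unit']} per allocation")
    # VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400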
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 584.923303] env[67964]: DEBUG nova.virt.hardware [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 584.923754] env[67964]: DEBUG nova.virt.hardware [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 584.923951] env[67964]: DEBUG nova.virt.hardware [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 584.924131] env[67964]: DEBUG nova.virt.hardware [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 584.924275] env[67964]: DEBUG nova.virt.hardware [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 584.924420] env[67964]: DEBUG nova.virt.hardware [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 584.924622] env[67964]: DEBUG nova.virt.hardware [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 584.924919] env[67964]: DEBUG nova.virt.hardware [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 584.924991] env[67964]: DEBUG nova.virt.hardware [None 
req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 584.925099] env[67964]: DEBUG nova.virt.hardware [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 584.925453] env[67964]: DEBUG nova.virt.hardware [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 584.926388] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67b26a24-4b83-441a-a626-974f461e3d8d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.938959] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70cf922f-287f-4243-a184-b9983f6a07ef {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.171191] env[67964]: DEBUG nova.network.neutron [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Updating instance_info_cache with network_info: [{"id": "41b8e312-a79a-49c8-98fd-dca83de83b7c", "address": "fa:16:3e:2d:4f:c7", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap41b8e312-a7", "ovs_interfaceid": "41b8e312-a79a-49c8-98fd-dca83de83b7c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 585.190503] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Releasing lock "refresh_cache-6ebecddf-098f-447f-a350-6644b50f87f7" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 585.192045] env[67964]: DEBUG nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 
tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Instance network_info: |[{"id": "41b8e312-a79a-49c8-98fd-dca83de83b7c", "address": "fa:16:3e:2d:4f:c7", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap41b8e312-a7", "ovs_interfaceid": "41b8e312-a79a-49c8-98fd-dca83de83b7c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 585.192827] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2d:4f:c7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fa01fe1a-83b6-4c10-af75-00ddb17f9bbf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '41b8e312-a79a-49c8-98fd-dca83de83b7c', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 585.208955] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Creating folder: OpenStack. Parent ref: group-v4. 
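The vmops "Instance VIF info" line above shows the Neutron network_info entry reduced to the few fields the VMware driver needs: the integration bridge as network_name, the MAC, an OpaqueNetwork reference keyed by the NSX logical-switch id, and the vmxnet3 model requested by the image. A condensed sketch of that mapping, with field choices taken from the logged output (illustrative, not the driver's actual helper):

    def vif_info_from_network_info(vif):
        # 'vif' is one entry of the network_info list logged above.
        details = vif["details"]
        return {
            "network_name": vif["network"]["bridge"],   # 'br-int'
            "mac_address": vif["address"],              # 'fa:16:3e:2d:4f:c7'
            "network_ref": {
                "type": "OpaqueNetwork",
                "network-id": details["nsx-logical-switch-id"],
                "network-type": "nsx.LogicalSwitch",
                "use-external-id": True,
            },
            "iface_id": vif["id"],
            "vif_model": "vmxnet3",
        }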
{{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 585.211504] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-65e896c6-9aef-4299-9ca2-e6d7ee1dc955 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.215733] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Acquiring lock "8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.215733] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Lock "8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 585.236282] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Created folder: OpenStack in parent group-v4. [ 585.236282] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Creating folder: Project (5115077efe5148fe9ce1a6f54b0f0178). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 585.236282] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0fba3640-8c59-47b3-8255-886f4f327566 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.242924] env[67964]: DEBUG nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 585.247255] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Created folder: Project (5115077efe5148fe9ce1a6f54b0f0178) in parent group-v690366. [ 585.247429] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Creating folder: Instances. Parent ref: group-v690367. 
{{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 585.247897] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-016594ac-bd15-4e65-bb7c-45bfe9e86dcf {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.258209] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Created folder: Instances in parent group-v690367. [ 585.258469] env[67964]: DEBUG oslo.service.loopingcall [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 585.258652] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 585.258850] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b308309f-2cb5-4245-bf74-68a01b8cc670 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.291834] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 585.291834] env[67964]: value = "task-3456677"
[ 585.291834] env[67964]: _type = "Task"
[ 585.291834] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 585.303332] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456677, 'name': CreateVM_Task} progress is 0%. 
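The three Folder.CreateFolder calls above build the nesting the driver keeps VMs in: OpenStack under the datacenter VM folder (group-v4), then Project (<project id>), then Instances; each created folder becomes the parent for the next level, and "already exists" outcomes are tolerated because concurrent builds race on the same chain. A hedged sketch with oslo.vmware's session API ('session' and the root folder ref are assumed to come from an established VMwareAPISession):

    def create_instance_folder_chain(session, vm_folder_ref, project_id):
        # Mirrors the folder chain logged above; the DuplicateName handling
        # the real driver performs is omitted for brevity.
        parent = vm_folder_ref
        for name in ("OpenStack", f"Project ({project_id})", "Instances"):
            parent = session.invoke_api(session.vim, "CreateFolder", parent, name=name)
        return parent  # ref of the Instances folder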
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 585.321230] env[67964]: DEBUG nova.policy [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c825459067614f33a86d1bdb8369a027', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bcf2714ed1b549559bbddad438e88840', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 585.327241] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.327241] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 585.333958] env[67964]: INFO nova.compute.claims [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 585.533277] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ff77eca-a7e9-4694-a6fe-583c30314137 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.543592] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e71a8cde-8904-418b-ac35-edb0ba60c96b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.585188] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c89f950-82d2-4ff6-a0d6-00cd31e19cb4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.593802] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-690f0520-32a1-4cca-8e7e-303534e5b671 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.613786] env[67964]: DEBUG nova.compute.provider_tree [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 585.624398] env[67964]: DEBUG nova.scheduler.client.report [None req-c84cc7ce-88a5-4940-904b-0f60d396497b 
tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 585.649801] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 585.650614] env[67964]: DEBUG nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 585.720448] env[67964]: DEBUG nova.compute.utils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 585.721856] env[67964]: DEBUG nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 585.721948] env[67964]: DEBUG nova.network.neutron [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 585.741021] env[67964]: DEBUG nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 585.802539] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456677, 'name': CreateVM_Task, 'duration_secs': 0.421947} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 585.803567] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 585.847466] env[67964]: DEBUG nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Start spawning the instance on the hypervisor. {{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 585.851277] env[67964]: DEBUG oslo_vmware.service [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ac71081-7830-4392-a7e8-1ba7c2a323db {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.858418] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 585.860713] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 585.860713] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 585.860713] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-321492d6-b1da-4310-ba43-8da12f072d30 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.866021] env[67964]: DEBUG oslo_vmware.api [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Waiting for the task: (returnval){ [ 585.866021] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5247aa02-0d1e-77bc-0c30-744135254d01" [ 585.866021] env[67964]: _type = "Task" [ 585.866021] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 585.874450] env[67964]: DEBUG oslo_vmware.api [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5247aa02-0d1e-77bc-0c30-744135254d01, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 585.905254] env[67964]: DEBUG nova.virt.hardware [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 585.905728] env[67964]: DEBUG nova.virt.hardware [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 585.905989] env[67964]: DEBUG nova.virt.hardware [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 585.906292] env[67964]: DEBUG nova.virt.hardware [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 585.906573] env[67964]: DEBUG nova.virt.hardware [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 585.906829] env[67964]: DEBUG nova.virt.hardware [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 585.907211] env[67964]: DEBUG nova.virt.hardware [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 585.907580] env[67964]: DEBUG nova.virt.hardware [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 585.907859] env[67964]: DEBUG nova.virt.hardware [None req-c84cc7ce-88a5-4940-904b-0f60d396497b 
tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 585.908221] env[67964]: DEBUG nova.virt.hardware [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 585.908705] env[67964]: DEBUG nova.virt.hardware [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 585.911807] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-550a002b-fada-4d53-9cea-7686ef0d9344 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.919310] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a02905ca-f93c-4a12-9ead-4ab86da8d79c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.175078] env[67964]: DEBUG nova.policy [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '95dac186edc14f84932739eac5e70df3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5aedeb9a846e4437a87d12ac678ab604', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 586.380105] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 586.380323] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 586.380577] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 586.380977] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 
tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 586.381448] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 586.381700] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-54842580-947c-4bd2-ad57-167f2c0ff441 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.399797] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 586.400161] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 586.401258] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53dc25f5-f2f9-4793-92b5-2721cac0a06b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.408441] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3653a7f3-7b1f-4c35-b433-5cc437db3aa9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.413829] env[67964]: DEBUG oslo_vmware.api [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Waiting for the task: (returnval){ [ 586.413829] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52aa122c-78b0-d795-69ba-bb25932f3f44" [ 586.413829] env[67964]: _type = "Task" [ 586.413829] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 586.424265] env[67964]: DEBUG oslo_vmware.api [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52aa122c-78b0-d795-69ba-bb25932f3f44, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 586.926509] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 586.926802] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Creating directory with path [datastore1] vmware_temp/2bdf92bf-bf35-40b1-86aa-e40512c8f96b/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 586.927057] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-04ff3d7d-d48e-4a71-addc-c86574245d45 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.946600] env[67964]: DEBUG nova.network.neutron [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Successfully created port: 89d0f60e-26f7-4d09-9ae1-f3dd062ffee7 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 586.950392] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Created directory with path [datastore1] vmware_temp/2bdf92bf-bf35-40b1-86aa-e40512c8f96b/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 586.950596] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Fetch image to [datastore1] vmware_temp/2bdf92bf-bf35-40b1-86aa-e40512c8f96b/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 586.950763] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/2bdf92bf-bf35-40b1-86aa-e40512c8f96b/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 586.951653] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9ab17b6-878e-41eb-93c2-92587f94c0f5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.959702] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ddead4c-c99a-40f1-b303-14cd34aec103 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.970091] env[67964]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-437684e2-b501-4816-8c99-1d04856a31cf {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 587.010512] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23bf2c17-654d-4ca9-b387-2a691c77a90b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 587.017498] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d1fe1443-ce00-4578-89bb-7e0879d8e041 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 587.046602] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 587.050494] env[67964]: DEBUG nova.network.neutron [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Successfully created port: 8fb15d52-7eb3-4a14-9f66-f92e1c2e550a {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 587.106959] env[67964]: DEBUG nova.network.neutron [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Successfully created port: c17b4cfe-92b5-45cc-9593-bfdc74a7308e {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 587.132109] env[67964]: DEBUG oslo_vmware.rw_handles [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2bdf92bf-bf35-40b1-86aa-e40512c8f96b/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 587.199669] env[67964]: DEBUG oslo_vmware.rw_handles [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 587.199883] env[67964]: DEBUG oslo_vmware.rw_handles [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2bdf92bf-bf35-40b1-86aa-e40512c8f96b/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 587.213927] env[67964]: DEBUG nova.network.neutron [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Successfully created port: 8862b639-9a44-49d0-946a-7752f0f8197c {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 588.366430] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Acquiring lock "180338df-2738-4eeb-8610-cb130d04f6d2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 588.366778] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Lock "180338df-2738-4eeb-8610-cb130d04f6d2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 588.385872] env[67964]: DEBUG nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 588.449253] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 588.449534] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 588.454971] env[67964]: INFO nova.compute.claims [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 588.675127] env[67964]: DEBUG nova.network.neutron [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Successfully created port: 70e4e785-598e-48d0-9ffe-e31afffdb9d8 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 588.743110] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx 
with opID=oslo.vmware-a02d776f-1b8c-4837-8312-21d1c06aea35 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 588.756288] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de784188-11d5-4144-be21-d13624ed4b58 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 588.792884] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-797fd660-d5ec-4615-a40b-40acd1493add {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 588.801141] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6582212e-2bbb-40cc-9c9f-4c70b769a33e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 588.818574] env[67964]: DEBUG nova.compute.provider_tree [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 588.833269] env[67964]: DEBUG nova.scheduler.client.report [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 588.855314] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.406s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 588.856133] env[67964]: DEBUG nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Start building networks asynchronously for instance. 
{{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 588.920132] env[67964]: DEBUG nova.compute.utils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 588.921758] env[67964]: DEBUG nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 588.921965] env[67964]: DEBUG nova.network.neutron [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 588.969496] env[67964]: DEBUG nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 589.075338] env[67964]: DEBUG nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 589.116845] env[67964]: DEBUG nova.virt.hardware [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 589.117112] env[67964]: DEBUG nova.virt.hardware [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 589.117273] env[67964]: DEBUG nova.virt.hardware [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 589.117605] env[67964]: DEBUG nova.virt.hardware [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 589.117605] env[67964]: DEBUG nova.virt.hardware [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 589.117772] env[67964]: DEBUG nova.virt.hardware [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 589.117948] env[67964]: DEBUG nova.virt.hardware [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 589.118124] env[67964]: DEBUG nova.virt.hardware [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 589.118447] env[67964]: DEBUG nova.virt.hardware [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 589.118447] env[67964]: DEBUG nova.virt.hardware [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 589.118606] env[67964]: DEBUG nova.virt.hardware [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 589.119750] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b00e52e-bc7e-44c5-8f26-1afb09dbd7f8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.127443] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aa45b52-1c86-4da7-a7b7-af0f68684c28 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.212636] env[67964]: DEBUG nova.compute.manager [req-5b119b53-c4c5-4e68-bf7c-b58b5015c1d2 req-f5e176fa-4cdb-4d64-8f2b-cb9f0a096a66 service nova] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Received event network-vif-plugged-41b8e312-a79a-49c8-98fd-dca83de83b7c {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 589.212831] env[67964]: DEBUG oslo_concurrency.lockutils [req-5b119b53-c4c5-4e68-bf7c-b58b5015c1d2 req-f5e176fa-4cdb-4d64-8f2b-cb9f0a096a66 service nova] Acquiring lock "6ebecddf-098f-447f-a350-6644b50f87f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 589.213040] env[67964]: DEBUG oslo_concurrency.lockutils [req-5b119b53-c4c5-4e68-bf7c-b58b5015c1d2 req-f5e176fa-4cdb-4d64-8f2b-cb9f0a096a66 service nova] Lock "6ebecddf-098f-447f-a350-6644b50f87f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 589.213634] env[67964]: DEBUG oslo_concurrency.lockutils [req-5b119b53-c4c5-4e68-bf7c-b58b5015c1d2 req-f5e176fa-4cdb-4d64-8f2b-cb9f0a096a66 service nova] Lock "6ebecddf-098f-447f-a350-6644b50f87f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 589.213821] env[67964]: DEBUG nova.compute.manager [req-5b119b53-c4c5-4e68-bf7c-b58b5015c1d2 req-f5e176fa-4cdb-4d64-8f2b-cb9f0a096a66 service nova] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] No waiting events found dispatching 
network-vif-plugged-41b8e312-a79a-49c8-98fd-dca83de83b7c {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 589.213983] env[67964]: WARNING nova.compute.manager [req-5b119b53-c4c5-4e68-bf7c-b58b5015c1d2 req-f5e176fa-4cdb-4d64-8f2b-cb9f0a096a66 service nova] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Received unexpected event network-vif-plugged-41b8e312-a79a-49c8-98fd-dca83de83b7c for instance with vm_state building and task_state spawning. [ 589.294742] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Acquiring lock "8b261c6e-741c-4d6c-9567-566af85cd68f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 589.295787] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Lock "8b261c6e-741c-4d6c-9567-566af85cd68f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 589.305724] env[67964]: DEBUG nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 589.365636] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 589.365901] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 589.367714] env[67964]: INFO nova.compute.claims [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 589.391705] env[67964]: DEBUG nova.policy [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ea4aa5c370dc404686e520fc9b18ec77', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9b44c88280d149ddacbbde44b468049e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 
'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 589.593196] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c303b211-00ed-40b7-b66f-ac08d3447bf5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.603092] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a8bacd5-174f-4a4a-9199-1ed105bdfd8a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.634503] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bd961ad-a8e7-49ee-8a6f-3eb1ad368930 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.643556] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd0cb2da-f549-43d0-802d-32fc586facbb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.663088] env[67964]: DEBUG nova.compute.provider_tree [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 589.676317] env[67964]: DEBUG nova.scheduler.client.report [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 589.683496] env[67964]: DEBUG nova.network.neutron [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Successfully updated port: 533485e1-3b77-4d77-af0f-65077a2275fe {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 589.707457] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Acquiring lock "refresh_cache-e55fdbbb-813d-427c-a53f-5be3fbeeb531" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 589.707619] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Acquired lock "refresh_cache-e55fdbbb-813d-427c-a53f-5be3fbeeb531" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 589.707740] env[67964]: 
DEBUG nova.network.neutron [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 589.711402] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 589.711402] env[67964]: DEBUG nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 589.777032] env[67964]: DEBUG nova.compute.utils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 589.778522] env[67964]: DEBUG nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 589.778799] env[67964]: DEBUG nova.network.neutron [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 589.793175] env[67964]: DEBUG nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 589.909682] env[67964]: DEBUG nova.network.neutron [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 589.944821] env[67964]: DEBUG nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 589.983656] env[67964]: DEBUG nova.virt.hardware [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 589.983656] env[67964]: DEBUG nova.virt.hardware [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 589.983656] env[67964]: DEBUG nova.virt.hardware [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 589.983822] env[67964]: DEBUG nova.virt.hardware [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 589.988335] env[67964]: DEBUG nova.virt.hardware [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 589.988335] env[67964]: DEBUG nova.virt.hardware [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 589.988335] env[67964]: DEBUG nova.virt.hardware [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 589.988335] env[67964]: DEBUG nova.virt.hardware [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 589.988335] env[67964]: DEBUG 
nova.virt.hardware [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 589.988494] env[67964]: DEBUG nova.virt.hardware [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 589.988494] env[67964]: DEBUG nova.virt.hardware [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 589.988494] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66ef98ad-4b8d-42c2-b81c-bef2062b33ab {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.000323] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e29fe46f-05d7-44b2-8642-ec5c916021b3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.107693] env[67964]: DEBUG nova.policy [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bc589955a3704238bddf1b6ebd1340bb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '28708549a6d54cabb9321784a134305a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 591.220403] env[67964]: DEBUG nova.network.neutron [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Successfully updated port: 89d0f60e-26f7-4d09-9ae1-f3dd062ffee7 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 591.244253] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquiring lock "refresh_cache-82096302-bbdd-49b4-bd19-bdf75343e03a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 591.244253] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquired lock "refresh_cache-82096302-bbdd-49b4-bd19-bdf75343e03a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 591.244253] env[67964]: DEBUG nova.network.neutron [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 
tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 591.319764] env[67964]: DEBUG nova.network.neutron [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Successfully created port: e76f8188-1437-4e64-b9ad-e21ecb2951fc {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 591.415985] env[67964]: DEBUG nova.network.neutron [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Updating instance_info_cache with network_info: [{"id": "533485e1-3b77-4d77-af0f-65077a2275fe", "address": "fa:16:3e:2f:f3:b1", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap533485e1-3b", "ovs_interfaceid": "533485e1-3b77-4d77-af0f-65077a2275fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 591.433044] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Releasing lock "refresh_cache-e55fdbbb-813d-427c-a53f-5be3fbeeb531" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 591.433286] env[67964]: DEBUG nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Instance network_info: |[{"id": "533485e1-3b77-4d77-af0f-65077a2275fe", "address": "fa:16:3e:2f:f3:b1", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": 
"l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap533485e1-3b", "ovs_interfaceid": "533485e1-3b77-4d77-af0f-65077a2275fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 591.433536] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2f:f3:b1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fa01fe1a-83b6-4c10-af75-00ddb17f9bbf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '533485e1-3b77-4d77-af0f-65077a2275fe', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 591.446620] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Creating folder: Project (4843a8f271fc438fb1296f2ae501f703). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 591.447737] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3f465375-6550-4b26-9b6f-cd5e1b8ca40b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 591.461022] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Created folder: Project (4843a8f271fc438fb1296f2ae501f703) in parent group-v690366. [ 591.461214] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Creating folder: Instances. Parent ref: group-v690370. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 591.461442] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b4338316-a363-4593-aa1d-6d8799ddfed4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 591.464819] env[67964]: DEBUG nova.network.neutron [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 591.475141] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Created folder: Instances in parent group-v690370. 
[ 591.475343] env[67964]: DEBUG oslo.service.loopingcall [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 591.475617] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 591.476098] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-faecce84-4415-46c3-a0b8-01416ba8c2fb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 591.501089] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 591.501089] env[67964]: value = "task-3456680" [ 591.501089] env[67964]: _type = "Task" [ 591.501089] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 591.509609] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456680, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 591.754541] env[67964]: DEBUG nova.network.neutron [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Successfully updated port: 8862b639-9a44-49d0-946a-7752f0f8197c {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 591.768894] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Acquiring lock "refresh_cache-bd297ef0-fa45-43c1-ab4e-14bcce806b35" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 591.769233] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Acquired lock "refresh_cache-bd297ef0-fa45-43c1-ab4e-14bcce806b35" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 591.769233] env[67964]: DEBUG nova.network.neutron [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 591.843548] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Acquiring lock "9c586d33-c563-45c7-8c54-1638a78a669c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 591.844959] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641
tempest-ServerAddressesTestJSON-1247186641-project-member] Lock "9c586d33-c563-45c7-8c54-1638a78a669c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 591.872333] env[67964]: DEBUG nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 591.953396] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 591.953687] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 591.955171] env[67964]: INFO nova.compute.claims [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 592.020534] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456680, 'name': CreateVM_Task, 'duration_secs': 0.33271} completed successfully.
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 592.020534] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 592.020534] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 592.020534] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 592.021290] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 592.022891] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-23fd7fcf-6962-4c7b-90ed-67f0a1fc7722 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.033341] env[67964]: DEBUG oslo_vmware.api [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Waiting for the task: (returnval){ [ 592.033341] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52508fe6-9720-e4b7-96d1-af2cd1b8db1c" [ 592.033341] env[67964]: _type = "Task" [ 592.033341] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 592.043498] env[67964]: DEBUG oslo_vmware.api [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52508fe6-9720-e4b7-96d1-af2cd1b8db1c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 592.130812] env[67964]: DEBUG nova.network.neutron [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Instance cache missing network info. 
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 592.204529] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d02d606b-1d74-46fd-9044-3c1f37ca1d20 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.215478] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcdfc9a7-e900-40eb-9c11-88b36bc82d0b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.251242] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1ce9890-8d9d-4cb5-b368-04c2cdebb786 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.259965] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-121cd948-4348-4082-af41-69791a8220a1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.275890] env[67964]: DEBUG nova.compute.provider_tree [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 592.288398] env[67964]: DEBUG nova.scheduler.client.report [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 592.304554] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.351s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 592.305065] env[67964]: DEBUG nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Start building networks asynchronously for instance. 
{{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 592.347816] env[67964]: DEBUG nova.compute.utils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 592.348701] env[67964]: DEBUG nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 592.348781] env[67964]: DEBUG nova.network.neutron [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 592.359177] env[67964]: DEBUG nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 592.450658] env[67964]: DEBUG nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 592.461439] env[67964]: DEBUG nova.network.neutron [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Successfully updated port: c17b4cfe-92b5-45cc-9593-bfdc74a7308e {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 592.487125] env[67964]: DEBUG nova.virt.hardware [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 592.487125] env[67964]: DEBUG nova.virt.hardware [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 592.487125] env[67964]: DEBUG nova.virt.hardware [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 592.487270] env[67964]: DEBUG nova.virt.hardware [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 592.487270] env[67964]: DEBUG nova.virt.hardware [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 592.487270] env[67964]: DEBUG nova.virt.hardware [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 592.487270] env[67964]: DEBUG nova.virt.hardware [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:569}} [ 592.487270] env[67964]: DEBUG nova.virt.hardware [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 592.487465] env[67964]: DEBUG nova.virt.hardware [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 592.487465] env[67964]: DEBUG nova.virt.hardware [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 592.487465] env[67964]: DEBUG nova.virt.hardware [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 592.493585] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc23dcbb-3c8d-42b8-b516-7a23d89be5f2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.499905] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Acquiring lock "refresh_cache-93509103-8c02-420d-bcaa-c2cf0847b1f0" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 592.500050] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Acquired lock "refresh_cache-93509103-8c02-420d-bcaa-c2cf0847b1f0" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 592.500200] env[67964]: DEBUG nova.network.neutron [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 592.510848] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e95452e-de73-4ad3-9b7a-d65edad41e21 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.065242] env[67964]: DEBUG nova.policy [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9488607deb214d8b849de87c817fd9ea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '503181247f834d34a1e788771dfcab0a', 
'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 593.066955] env[67964]: DEBUG nova.network.neutron [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Successfully updated port: 8fb15d52-7eb3-4a14-9f66-f92e1c2e550a {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 593.068382] env[67964]: DEBUG nova.network.neutron [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 593.070789] env[67964]: DEBUG nova.network.neutron [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Successfully created port: 1d2d7bc9-3c84-43c3-b627-c75db5dd3256 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 593.084561] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 593.084848] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 593.084975] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 593.085443] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Acquiring lock "refresh_cache-371aeb17-ad59-4a01-88f7-466dfee8d293" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 593.085518] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Acquired lock "refresh_cache-371aeb17-ad59-4a01-88f7-466dfee8d293" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 593.085652] env[67964]: DEBUG nova.network.neutron [None 
req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 593.422953] env[67964]: DEBUG nova.network.neutron [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 593.487954] env[67964]: DEBUG nova.network.neutron [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Updating instance_info_cache with network_info: [{"id": "89d0f60e-26f7-4d09-9ae1-f3dd062ffee7", "address": "fa:16:3e:b4:61:90", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap89d0f60e-26", "ovs_interfaceid": "89d0f60e-26f7-4d09-9ae1-f3dd062ffee7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 593.507836] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Releasing lock "refresh_cache-82096302-bbdd-49b4-bd19-bdf75343e03a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 593.508181] env[67964]: DEBUG nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Instance network_info: |[{"id": "89d0f60e-26f7-4d09-9ae1-f3dd062ffee7", "address": "fa:16:3e:b4:61:90", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", 
"tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap89d0f60e-26", "ovs_interfaceid": "89d0f60e-26f7-4d09-9ae1-f3dd062ffee7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 593.508606] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b4:61:90', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fa01fe1a-83b6-4c10-af75-00ddb17f9bbf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '89d0f60e-26f7-4d09-9ae1-f3dd062ffee7', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 593.517470] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Creating folder: Project (2ece3f919eb54985b4ab3bf0a9362717). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 593.520554] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4223770c-02ab-41ea-94f1-ee77a8fe2a29 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.533068] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Created folder: Project (2ece3f919eb54985b4ab3bf0a9362717) in parent group-v690366. [ 593.533348] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Creating folder: Instances. Parent ref: group-v690373. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 593.533526] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-499aeb78-7e99-42c0-bc39-4a68896950ba {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.546356] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Created folder: Instances in parent group-v690373. [ 593.546356] env[67964]: DEBUG oslo.service.loopingcall [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 593.550892] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 593.550892] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a2a07c03-11e2-4aa5-8cdf-926f3d446d90 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.573411] env[67964]: DEBUG nova.network.neutron [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Successfully updated port: 70e4e785-598e-48d0-9ffe-e31afffdb9d8 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 593.576025] env[67964]: DEBUG nova.compute.manager [req-9c54210e-dbdd-4581-9ad9-d703d837ca04 req-6be0c07d-5752-4dea-afb5-4fb562e884f3 service nova] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Received event network-vif-plugged-533485e1-3b77-4d77-af0f-65077a2275fe {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 593.576102] env[67964]: DEBUG oslo_concurrency.lockutils [req-9c54210e-dbdd-4581-9ad9-d703d837ca04 req-6be0c07d-5752-4dea-afb5-4fb562e884f3 service nova] Acquiring lock "e55fdbbb-813d-427c-a53f-5be3fbeeb531-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 593.576475] env[67964]: DEBUG oslo_concurrency.lockutils [req-9c54210e-dbdd-4581-9ad9-d703d837ca04 req-6be0c07d-5752-4dea-afb5-4fb562e884f3 service nova] Lock "e55fdbbb-813d-427c-a53f-5be3fbeeb531-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 593.576726] env[67964]: DEBUG oslo_concurrency.lockutils [req-9c54210e-dbdd-4581-9ad9-d703d837ca04 req-6be0c07d-5752-4dea-afb5-4fb562e884f3 service nova] Lock "e55fdbbb-813d-427c-a53f-5be3fbeeb531-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 593.576963] env[67964]: DEBUG nova.compute.manager [req-9c54210e-dbdd-4581-9ad9-d703d837ca04 req-6be0c07d-5752-4dea-afb5-4fb562e884f3 service nova] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] No waiting events found dispatching network-vif-plugged-533485e1-3b77-4d77-af0f-65077a2275fe {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 593.577195] env[67964]: WARNING nova.compute.manager [req-9c54210e-dbdd-4581-9ad9-d703d837ca04 req-6be0c07d-5752-4dea-afb5-4fb562e884f3 service nova] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Received unexpected event network-vif-plugged-533485e1-3b77-4d77-af0f-65077a2275fe for instance with vm_state building and task_state spawning. [ 593.583048] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 593.583048] env[67964]: value = "task-3456683" [ 593.583048] env[67964]: _type = "Task" [ 593.583048] env[67964]: } to complete.
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 593.591896] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Acquiring lock "refresh_cache-8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 593.591951] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Acquired lock "refresh_cache-8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 593.592157] env[67964]: DEBUG nova.network.neutron [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 593.599312] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456683, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 593.743607] env[67964]: DEBUG nova.network.neutron [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 593.767284] env[67964]: DEBUG nova.network.neutron [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Updating instance_info_cache with network_info: [{"id": "8862b639-9a44-49d0-946a-7752f0f8197c", "address": "fa:16:3e:cf:0a:70", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.244", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8862b639-9a", "ovs_interfaceid": "8862b639-9a44-49d0-946a-7752f0f8197c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 593.782329] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Releasing lock 
"refresh_cache-bd297ef0-fa45-43c1-ab4e-14bcce806b35" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 593.782638] env[67964]: DEBUG nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Instance network_info: |[{"id": "8862b639-9a44-49d0-946a-7752f0f8197c", "address": "fa:16:3e:cf:0a:70", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.244", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8862b639-9a", "ovs_interfaceid": "8862b639-9a44-49d0-946a-7752f0f8197c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 593.785052] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cf:0a:70', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fa01fe1a-83b6-4c10-af75-00ddb17f9bbf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8862b639-9a44-49d0-946a-7752f0f8197c', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 593.793235] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Creating folder: Project (bcf2714ed1b549559bbddad438e88840). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 593.798455] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-79f0a457-0356-46b8-a685-a4fa0a54484d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.812249] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 593.812646] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Created folder: Project (bcf2714ed1b549559bbddad438e88840) in parent group-v690366. 
[ 593.815406] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Creating folder: Instances. Parent ref: group-v690376. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 593.815406] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 593.815406] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 593.815406] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 593.816376] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6c5d0472-4707-439f-875b-79ae4d152f3c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.829192] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Created folder: Instances in parent group-v690376. [ 593.830096] env[67964]: DEBUG oslo.service.loopingcall [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 593.830096] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 593.830096] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0fb4b005-40af-4475-8fbb-567ea339e81f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.868607] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 593.870181] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 593.870401] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 593.870654] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 593.870654] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 593.870814] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 593.870935] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 593.871064] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 593.871180] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 593.871294] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 593.871410] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 593.872594] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 593.873239] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 593.873239] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 593.873239] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 593.873429] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 593.873620] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 593.873762] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 593.873896] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 593.880790] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 593.880790] env[67964]: value = "task-3456686" [ 593.880790] env[67964]: _type = "Task" [ 593.880790] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 593.897024] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456686, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 593.898186] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 593.899125] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 593.899125] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 593.899125] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 593.900194] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84e7767a-d0d4-4614-8802-c461cbc7a3d7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.916492] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da845607-7415-478c-b04b-3a8c938c3cf4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.937654] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afdab0f5-f570-4704-8931-67e4b329599e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.947412] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6458f922-d033-49de-862f-5a4eaa85efe7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.990250] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180917MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 593.990401] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 593.990598] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 594.102259] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 6ebecddf-098f-447f-a350-6644b50f87f7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 594.102443] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance e55fdbbb-813d-427c-a53f-5be3fbeeb531 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 594.102570] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 93509103-8c02-420d-bcaa-c2cf0847b1f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 594.102696] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 371aeb17-ad59-4a01-88f7-466dfee8d293 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 594.102810] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 82096302-bbdd-49b4-bd19-bdf75343e03a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 594.102923] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance bd297ef0-fa45-43c1-ab4e-14bcce806b35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 594.103053] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 594.103166] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 180338df-2738-4eeb-8610-cb130d04f6d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 594.103274] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8b261c6e-741c-4d6c-9567-566af85cd68f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 594.103381] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9c586d33-c563-45c7-8c54-1638a78a669c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 594.103568] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 594.103733] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 594.106839] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456683, 'name': CreateVM_Task} progress is 99%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 594.302081] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-145851d6-bd23-4c3d-9cb3-c94de1d94346 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.311371] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87b06d46-80a8-4486-9ebf-d73aa6004ee9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.360377] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6a17b66-b420-47c1-93eb-6ce3fff5ff23 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.368845] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee60812d-a6f4-4f35-bb61-4c580678076f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.383242] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 594.392781] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456686, 'name': CreateVM_Task, 'duration_secs': 0.356485} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 594.393792] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 594.394562] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 594.399973] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 594.400151] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 594.403202] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 594.403202] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2f7ca7e3-06a2-4242-849e-f0d6216fa889 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.406815] env[67964]: DEBUG oslo_vmware.api [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Waiting for the task: (returnval){ [ 594.406815] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52eab80f-88db-3d52-5089-0341c2dc5238" [ 594.406815] env[67964]: _type = "Task" [ 594.406815] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 594.413471] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 594.414157] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.423s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 594.417853] env[67964]: DEBUG oslo_vmware.api [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52eab80f-88db-3d52-5089-0341c2dc5238, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 594.455479] env[67964]: DEBUG nova.network.neutron [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Updating instance_info_cache with network_info: [{"id": "70e4e785-598e-48d0-9ffe-e31afffdb9d8", "address": "fa:16:3e:6b:0c:bf", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.199", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap70e4e785-59", "ovs_interfaceid": "70e4e785-598e-48d0-9ffe-e31afffdb9d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 594.469104] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Releasing lock "refresh_cache-8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 594.469775] env[67964]: DEBUG nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Instance network_info: |[{"id": "70e4e785-598e-48d0-9ffe-e31afffdb9d8", "address": "fa:16:3e:6b:0c:bf", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": 
"192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.199", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap70e4e785-59", "ovs_interfaceid": "70e4e785-598e-48d0-9ffe-e31afffdb9d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 594.470311] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6b:0c:bf', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fa01fe1a-83b6-4c10-af75-00ddb17f9bbf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '70e4e785-598e-48d0-9ffe-e31afffdb9d8', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 594.479857] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Creating folder: Project (5aedeb9a846e4437a87d12ac678ab604). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 594.480462] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3cc15363-5b91-4a8f-a52c-46b5b181f44a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.493035] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Created folder: Project (5aedeb9a846e4437a87d12ac678ab604) in parent group-v690366. [ 594.493035] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Creating folder: Instances. Parent ref: group-v690379. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 594.493885] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-88df36b1-45ca-4858-acae-952b120a997a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.504699] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Created folder: Instances in parent group-v690379. 
[ 594.505020] env[67964]: DEBUG oslo.service.loopingcall [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 594.505160] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 594.505350] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c79f68ff-5ad2-4110-85df-2704febf5754 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.526676] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 594.526676] env[67964]: value = "task-3456689" [ 594.526676] env[67964]: _type = "Task" [ 594.526676] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 594.534607] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456689, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 594.593260] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456683, 'name': CreateVM_Task} progress is 99%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 594.714778] env[67964]: DEBUG nova.compute.manager [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Received event network-changed-41b8e312-a79a-49c8-98fd-dca83de83b7c {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 594.714971] env[67964]: DEBUG nova.compute.manager [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Refreshing instance network info cache due to event network-changed-41b8e312-a79a-49c8-98fd-dca83de83b7c. 
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 594.717121] env[67964]: DEBUG oslo_concurrency.lockutils [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] Acquiring lock "refresh_cache-6ebecddf-098f-447f-a350-6644b50f87f7" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 594.717121] env[67964]: DEBUG oslo_concurrency.lockutils [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] Acquired lock "refresh_cache-6ebecddf-098f-447f-a350-6644b50f87f7" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 594.720156] env[67964]: DEBUG nova.network.neutron [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Refreshing network info cache for port 41b8e312-a79a-49c8-98fd-dca83de83b7c {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 594.766672] env[67964]: DEBUG nova.network.neutron [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Successfully updated port: e76f8188-1437-4e64-b9ad-e21ecb2951fc {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 594.785387] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Acquiring lock "refresh_cache-180338df-2738-4eeb-8610-cb130d04f6d2" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 594.785528] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Acquired lock "refresh_cache-180338df-2738-4eeb-8610-cb130d04f6d2" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 594.786940] env[67964]: DEBUG nova.network.neutron [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 594.920876] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 594.921155] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 594.921361] env[67964]: DEBUG oslo_concurrency.lockutils [None 
req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 595.040204] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456689, 'name': CreateVM_Task} progress is 99%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 595.068423] env[67964]: DEBUG nova.network.neutron [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Updating instance_info_cache with network_info: [{"id": "c17b4cfe-92b5-45cc-9593-bfdc74a7308e", "address": "fa:16:3e:e4:27:19", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.87", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc17b4cfe-92", "ovs_interfaceid": "c17b4cfe-92b5-45cc-9593-bfdc74a7308e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 595.092326] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Releasing lock "refresh_cache-93509103-8c02-420d-bcaa-c2cf0847b1f0" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 595.093244] env[67964]: DEBUG nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Instance network_info: |[{"id": "c17b4cfe-92b5-45cc-9593-bfdc74a7308e", "address": "fa:16:3e:e4:27:19", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.87", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, 
"nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc17b4cfe-92", "ovs_interfaceid": "c17b4cfe-92b5-45cc-9593-bfdc74a7308e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 595.094346] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e4:27:19', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fa01fe1a-83b6-4c10-af75-00ddb17f9bbf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c17b4cfe-92b5-45cc-9593-bfdc74a7308e', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 595.102308] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Creating folder: Project (51a75cd8fdaa47f8b37e6c78769a63f8). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 595.108263] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-23b8a7f9-988a-43aa-9d4f-cb8cfa127bcc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.110161] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456683, 'name': CreateVM_Task, 'duration_secs': 1.307581} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 595.112045] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 595.112282] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 595.112899] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 595.113612] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 595.115409] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d4248162-9969-4883-b095-5387cad46842 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.121070] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Created folder: Project (51a75cd8fdaa47f8b37e6c78769a63f8) in parent group-v690366. [ 595.121070] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Creating folder: Instances. Parent ref: group-v690382. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 595.121070] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-89be3910-fe62-488d-b5bd-c166bf523c2b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.122337] env[67964]: DEBUG oslo_vmware.api [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Waiting for the task: (returnval){ [ 595.122337] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]528620a9-f756-c388-a5ea-d76e6de385f2" [ 595.122337] env[67964]: _type = "Task" [ 595.122337] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 595.131758] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Created folder: Instances in parent group-v690382. 
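
[annotation] The "Waiting for the task: (returnval){...} to complete" blocks and the "progress is 0% ... 99% ... completed successfully" lines come from oslo_vmware.api's wait_for_task/_poll_task. A much-simplified, hypothetical version of that polling loop (the real one is built on oslo.service looping calls with richer fault handling):

    import time

    # Simplified sketch of a vCenter task-polling loop. get_task_info is a
    # caller-supplied callable returning an object with .state, .progress
    # and .error, standing in for a TaskInfo fetch from vCenter.
    def wait_for_task(get_task_info, poll_interval=0.5):
        while True:
            info = get_task_info()
            if info.state == 'running':
                print('progress is %s%%' % info.progress)  # cf. _poll_task
            elif info.state == 'success':
                print('completed successfully')            # cf. api.py:444
                return info
            else:                                          # state 'error'
                raise RuntimeError(info.error)
            time.sleep(poll_interval)
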
[ 595.131758] env[67964]: DEBUG oslo.service.loopingcall [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 595.133336] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 595.133519] env[67964]: DEBUG oslo_vmware.api [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]528620a9-f756-c388-a5ea-d76e6de385f2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 595.133748] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b8a5e063-fb13-43ca-a160-7cc85eb9cca8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.149087] env[67964]: DEBUG nova.network.neutron [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 595.159106] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 595.159106] env[67964]: value = "task-3456692" [ 595.159106] env[67964]: _type = "Task" [ 595.159106] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 595.166394] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456692, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 595.218214] env[67964]: DEBUG nova.network.neutron [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Updating instance_info_cache with network_info: [{"id": "8fb15d52-7eb3-4a14-9f66-f92e1c2e550a", "address": "fa:16:3e:8f:ee:e3", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8fb15d52-7e", "ovs_interfaceid": "8fb15d52-7eb3-4a14-9f66-f92e1c2e550a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 595.237243] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Releasing lock "refresh_cache-371aeb17-ad59-4a01-88f7-466dfee8d293" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 595.237951] env[67964]: DEBUG nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Instance network_info: |[{"id": "8fb15d52-7eb3-4a14-9f66-f92e1c2e550a", "address": "fa:16:3e:8f:ee:e3", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8fb15d52-7e", "ovs_interfaceid": "8fb15d52-7eb3-4a14-9f66-f92e1c2e550a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 
595.238315] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8f:ee:e3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fa01fe1a-83b6-4c10-af75-00ddb17f9bbf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8fb15d52-7eb3-4a14-9f66-f92e1c2e550a', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 595.247405] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Creating folder: Project (2b0971e98f514aafb057e5c603c2137a). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 595.249012] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-87c516af-bbdf-491a-aa61-4994c74ca9af {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.267165] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Created folder: Project (2b0971e98f514aafb057e5c603c2137a) in parent group-v690366. [ 595.267390] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Creating folder: Instances. Parent ref: group-v690385. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 595.267599] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-35fb9f32-ef39-496a-8c56-4d1db78ff595 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.275975] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Created folder: Instances in parent group-v690385. [ 595.276231] env[67964]: DEBUG oslo.service.loopingcall [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 595.276420] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 595.276848] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bce88422-1517-4481-97ef-5f69a6f1f7b3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.302576] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 595.302576] env[67964]: value = "task-3456695" [ 595.302576] env[67964]: _type = "Task" [ 595.302576] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 595.311203] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456695, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 595.361697] env[67964]: DEBUG nova.compute.manager [req-3350e974-d0cb-4dfe-81d3-65c3fb5a6935 req-3ce9b4a3-cf5d-43c0-99c7-110212f16fb2 service nova] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Received event network-vif-plugged-70e4e785-598e-48d0-9ffe-e31afffdb9d8 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 595.361697] env[67964]: DEBUG oslo_concurrency.lockutils [req-3350e974-d0cb-4dfe-81d3-65c3fb5a6935 req-3ce9b4a3-cf5d-43c0-99c7-110212f16fb2 service nova] Acquiring lock "8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 595.361697] env[67964]: DEBUG oslo_concurrency.lockutils [req-3350e974-d0cb-4dfe-81d3-65c3fb5a6935 req-3ce9b4a3-cf5d-43c0-99c7-110212f16fb2 service nova] Lock "8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 595.361847] env[67964]: DEBUG oslo_concurrency.lockutils [req-3350e974-d0cb-4dfe-81d3-65c3fb5a6935 req-3ce9b4a3-cf5d-43c0-99c7-110212f16fb2 service nova] Lock "8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 595.362013] env[67964]: DEBUG nova.compute.manager [req-3350e974-d0cb-4dfe-81d3-65c3fb5a6935 req-3ce9b4a3-cf5d-43c0-99c7-110212f16fb2 service nova] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] No waiting events found dispatching network-vif-plugged-70e4e785-598e-48d0-9ffe-e31afffdb9d8 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 595.362189] env[67964]: WARNING nova.compute.manager [req-3350e974-d0cb-4dfe-81d3-65c3fb5a6935 req-3ce9b4a3-cf5d-43c0-99c7-110212f16fb2 service nova] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Received unexpected event network-vif-plugged-70e4e785-598e-48d0-9ffe-e31afffdb9d8 for instance with vm_state building and task_state spawning. 
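
[annotation] Every 'Acquiring lock "..." by "..."' / 'acquired ... waited 0.000s' / '"released" ... held 0.000s' triplet in these records is emitted by oslo.concurrency's synchronized wrapper (the inner function logged at lockutils.py:404/409/423). A minimal sketch of that pattern, reusing a lock name taken from the log; the decorated body is a placeholder, not nova's _pop_event:

    from oslo_concurrency import lockutils

    # Sketch of the decorator pattern behind the lock lines above; the
    # default internal semaphore lock is what produces the DEBUG triplet.
    @lockutils.synchronized('8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b-events')
    def _pop_event():
        # critical section: at most one thread touches the per-instance
        # event bookkeeping at a time
        pass

    _pop_event()  # logs acquire/release at DEBUG once logging is configured
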
[ 595.541381] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456689, 'name': CreateVM_Task, 'duration_secs': 0.578683} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 595.541381] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 595.541381] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 595.635421] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 595.636402] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 595.637090] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 595.637732] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 595.638114] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 595.638615] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5ba2ca0d-40a8-473c-8301-62ead8c92c69 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.646014] env[67964]: DEBUG oslo_vmware.api [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Waiting for the task: (returnval){ [ 595.646014] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52698bec-d767-8df9-76b6-1b36ebbe8a10" [ 
595.646014] env[67964]: _type = "Task" [ 595.646014] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 595.656572] env[67964]: DEBUG oslo_vmware.api [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52698bec-d767-8df9-76b6-1b36ebbe8a10, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 595.668474] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456692, 'name': CreateVM_Task, 'duration_secs': 0.337116} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 595.671576] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 595.672355] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 595.817691] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456695, 'name': CreateVM_Task, 'duration_secs': 0.316333} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 595.818048] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 595.818896] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 595.852029] env[67964]: DEBUG nova.network.neutron [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Updating instance_info_cache with network_info: [{"id": "e76f8188-1437-4e64-b9ad-e21ecb2951fc", "address": "fa:16:3e:ff:68:e1", "network": {"id": "3ba557bc-4f3d-4dd5-97c0-5303ca5b8c89", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-666317981-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9b44c88280d149ddacbbde44b468049e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0ea0fc1b-0424-46ec-bef5-6b57b7d184d8", "external-id": 
"nsx-vlan-transportzone-618", "segmentation_id": 618, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape76f8188-14", "ovs_interfaceid": "e76f8188-1437-4e64-b9ad-e21ecb2951fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 595.866171] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Releasing lock "refresh_cache-180338df-2738-4eeb-8610-cb130d04f6d2" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 595.868997] env[67964]: DEBUG nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Instance network_info: |[{"id": "e76f8188-1437-4e64-b9ad-e21ecb2951fc", "address": "fa:16:3e:ff:68:e1", "network": {"id": "3ba557bc-4f3d-4dd5-97c0-5303ca5b8c89", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-666317981-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9b44c88280d149ddacbbde44b468049e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0ea0fc1b-0424-46ec-bef5-6b57b7d184d8", "external-id": "nsx-vlan-transportzone-618", "segmentation_id": 618, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape76f8188-14", "ovs_interfaceid": "e76f8188-1437-4e64-b9ad-e21ecb2951fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 595.869252] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ff:68:e1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0ea0fc1b-0424-46ec-bef5-6b57b7d184d8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e76f8188-1437-4e64-b9ad-e21ecb2951fc', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 595.878664] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Creating folder: Project (9b44c88280d149ddacbbde44b468049e). Parent ref: group-v690366. 
{{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 595.879315] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a022afc5-1947-4a75-ac04-aee0647d4edb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.890059] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Created folder: Project (9b44c88280d149ddacbbde44b468049e) in parent group-v690366. [ 595.890265] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Creating folder: Instances. Parent ref: group-v690388. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 595.890501] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3d467eb8-47cc-43aa-9759-890d29bfb50d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.904093] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Created folder: Instances in parent group-v690388. [ 595.904339] env[67964]: DEBUG oslo.service.loopingcall [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 595.904665] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 595.904750] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-be758286-de97-4cf4-be49-c1d0cd5b763b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.936796] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 595.936796] env[67964]: value = "task-3456698" [ 595.936796] env[67964]: _type = "Task" [ 595.936796] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 595.947335] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456698, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 596.110954] env[67964]: DEBUG nova.network.neutron [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Updated VIF entry in instance network info cache for port 41b8e312-a79a-49c8-98fd-dca83de83b7c. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 596.110954] env[67964]: DEBUG nova.network.neutron [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Updating instance_info_cache with network_info: [{"id": "41b8e312-a79a-49c8-98fd-dca83de83b7c", "address": "fa:16:3e:2d:4f:c7", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap41b8e312-a7", "ovs_interfaceid": "41b8e312-a79a-49c8-98fd-dca83de83b7c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 596.122812] env[67964]: DEBUG oslo_concurrency.lockutils [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] Releasing lock "refresh_cache-6ebecddf-098f-447f-a350-6644b50f87f7" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 596.123787] env[67964]: DEBUG nova.compute.manager [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Received event network-vif-plugged-89d0f60e-26f7-4d09-9ae1-f3dd062ffee7 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 596.123787] env[67964]: DEBUG oslo_concurrency.lockutils [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] Acquiring lock "82096302-bbdd-49b4-bd19-bdf75343e03a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 596.123787] env[67964]: DEBUG oslo_concurrency.lockutils [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] Lock "82096302-bbdd-49b4-bd19-bdf75343e03a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 596.123787] env[67964]: DEBUG oslo_concurrency.lockutils [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] Lock "82096302-bbdd-49b4-bd19-bdf75343e03a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 596.124061] env[67964]: DEBUG nova.compute.manager 
[req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] No waiting events found dispatching network-vif-plugged-89d0f60e-26f7-4d09-9ae1-f3dd062ffee7 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 596.124061] env[67964]: WARNING nova.compute.manager [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Received unexpected event network-vif-plugged-89d0f60e-26f7-4d09-9ae1-f3dd062ffee7 for instance with vm_state building and task_state spawning. [ 596.124132] env[67964]: DEBUG nova.compute.manager [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Received event network-vif-plugged-8862b639-9a44-49d0-946a-7752f0f8197c {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 596.124312] env[67964]: DEBUG oslo_concurrency.lockutils [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] Acquiring lock "bd297ef0-fa45-43c1-ab4e-14bcce806b35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 596.125085] env[67964]: DEBUG oslo_concurrency.lockutils [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] Lock "bd297ef0-fa45-43c1-ab4e-14bcce806b35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 596.125085] env[67964]: DEBUG oslo_concurrency.lockutils [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] Lock "bd297ef0-fa45-43c1-ab4e-14bcce806b35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 596.125085] env[67964]: DEBUG nova.compute.manager [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] No waiting events found dispatching network-vif-plugged-8862b639-9a44-49d0-946a-7752f0f8197c {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 596.125085] env[67964]: WARNING nova.compute.manager [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Received unexpected event network-vif-plugged-8862b639-9a44-49d0-946a-7752f0f8197c for instance with vm_state building and task_state spawning. 
[ 596.125262] env[67964]: DEBUG nova.compute.manager [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Received event network-changed-89d0f60e-26f7-4d09-9ae1-f3dd062ffee7 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 596.129374] env[67964]: DEBUG nova.compute.manager [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Refreshing instance network info cache due to event network-changed-89d0f60e-26f7-4d09-9ae1-f3dd062ffee7. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 596.129648] env[67964]: DEBUG oslo_concurrency.lockutils [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] Acquiring lock "refresh_cache-82096302-bbdd-49b4-bd19-bdf75343e03a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 596.129799] env[67964]: DEBUG oslo_concurrency.lockutils [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] Acquired lock "refresh_cache-82096302-bbdd-49b4-bd19-bdf75343e03a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 596.130221] env[67964]: DEBUG nova.network.neutron [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Refreshing network info cache for port 89d0f60e-26f7-4d09-9ae1-f3dd062ffee7 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 596.158936] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 596.159299] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 596.159517] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 596.159740] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 596.160076] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 
tempest-ServersAdminNegativeTestJSON-123213895-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 596.160341] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-90151b4a-8644-47e8-a176-dceaccb78fc4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 596.166911] env[67964]: DEBUG oslo_vmware.api [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Waiting for the task: (returnval){ [ 596.166911] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52d4b5dc-b4c1-836a-e5a2-573ed3ec8a91" [ 596.166911] env[67964]: _type = "Task" [ 596.166911] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 596.178019] env[67964]: DEBUG oslo_vmware.api [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52d4b5dc-b4c1-836a-e5a2-573ed3ec8a91, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 596.259505] env[67964]: DEBUG nova.network.neutron [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Successfully created port: 73ebab6a-7b2b-4d18-8993-a05deac26ddb {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 596.320895] env[67964]: DEBUG nova.network.neutron [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Successfully updated port: 1d2d7bc9-3c84-43c3-b627-c75db5dd3256 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 596.340306] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Acquiring lock "refresh_cache-8b261c6e-741c-4d6c-9567-566af85cd68f" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 596.340306] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Acquired lock "refresh_cache-8b261c6e-741c-4d6c-9567-566af85cd68f" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 596.340306] env[67964]: DEBUG nova.network.neutron [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 596.448742] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456698, 'name': 
CreateVM_Task, 'duration_secs': 0.305369} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 596.448850] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 596.449921] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 596.584325] env[67964]: DEBUG nova.network.neutron [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 596.679961] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 596.680554] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 596.681085] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 596.681602] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 596.682108] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 596.682575] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7c167313-d4b3-4941-970b-ada2d59fa1a7 {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 596.688502] env[67964]: DEBUG oslo_vmware.api [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Waiting for the task: (returnval){ [ 596.688502] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]526d2e1f-bc65-13a0-1735-1e68cfa98a59" [ 596.688502] env[67964]: _type = "Task" [ 596.688502] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 596.701136] env[67964]: DEBUG oslo_vmware.api [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]526d2e1f-bc65-13a0-1735-1e68cfa98a59, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 597.203160] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 597.203160] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 597.203160] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 597.203160] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 597.204352] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 597.204678] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-134e8198-2005-4095-b112-1543ba7a5545 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 597.209635] env[67964]: DEBUG oslo_vmware.api [None 
req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Waiting for the task: (returnval){ [ 597.209635] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]524d1ed7-4afc-8fc1-6d73-db8ac2313a95" [ 597.209635] env[67964]: _type = "Task" [ 597.209635] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 597.219872] env[67964]: DEBUG oslo_vmware.api [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]524d1ed7-4afc-8fc1-6d73-db8ac2313a95, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 597.722639] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 597.722942] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 597.723178] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 597.745169] env[67964]: DEBUG nova.network.neutron [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Updating instance_info_cache with network_info: [{"id": "1d2d7bc9-3c84-43c3-b627-c75db5dd3256", "address": "fa:16:3e:f9:88:ec", "network": {"id": "8140da01-5247-483f-8a73-214e2182369e", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1878883737-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "28708549a6d54cabb9321784a134305a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d2d7bc9-3c", "ovs_interfaceid": 
"1d2d7bc9-3c84-43c3-b627-c75db5dd3256", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 597.767933] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Releasing lock "refresh_cache-8b261c6e-741c-4d6c-9567-566af85cd68f" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 597.768288] env[67964]: DEBUG nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Instance network_info: |[{"id": "1d2d7bc9-3c84-43c3-b627-c75db5dd3256", "address": "fa:16:3e:f9:88:ec", "network": {"id": "8140da01-5247-483f-8a73-214e2182369e", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1878883737-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "28708549a6d54cabb9321784a134305a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d2d7bc9-3c", "ovs_interfaceid": "1d2d7bc9-3c84-43c3-b627-c75db5dd3256", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 597.769072] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f9:88:ec', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '298bb8ef-4765-494c-b157-7a349218bd1e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1d2d7bc9-3c84-43c3-b627-c75db5dd3256', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 597.778571] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Creating folder: Project (28708549a6d54cabb9321784a134305a). Parent ref: group-v690366. 
{{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 597.779271] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-31b7972b-65ae-4e7f-bf0a-bc09146ee55d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 597.792114] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Created folder: Project (28708549a6d54cabb9321784a134305a) in parent group-v690366. [ 597.792114] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Creating folder: Instances. Parent ref: group-v690391. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 597.792114] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5a4c3671-09d4-444a-9e96-950208559e38 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 597.802323] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Created folder: Instances in parent group-v690391. [ 597.802561] env[67964]: DEBUG oslo.service.loopingcall [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 597.802753] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 597.802960] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7a0fa02b-c624-422b-9536-3b184d8be511 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 597.824700] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 597.824700] env[67964]: value = "task-3456701" [ 597.824700] env[67964]: _type = "Task" [ 597.824700] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 597.836185] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456701, 'name': CreateVM_Task} progress is 5%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 597.934575] env[67964]: DEBUG nova.network.neutron [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Updated VIF entry in instance network info cache for port 89d0f60e-26f7-4d09-9ae1-f3dd062ffee7. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 597.934575] env[67964]: DEBUG nova.network.neutron [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Updating instance_info_cache with network_info: [{"id": "89d0f60e-26f7-4d09-9ae1-f3dd062ffee7", "address": "fa:16:3e:b4:61:90", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap89d0f60e-26", "ovs_interfaceid": "89d0f60e-26f7-4d09-9ae1-f3dd062ffee7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 597.950409] env[67964]: DEBUG oslo_concurrency.lockutils [req-2107bad5-c568-4110-b40a-89f7239208ff req-f8ae8bb3-94aa-4fb7-a474-eeabe7447bee service nova] Releasing lock "refresh_cache-82096302-bbdd-49b4-bd19-bdf75343e03a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 598.316089] env[67964]: DEBUG nova.compute.manager [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Received event network-changed-533485e1-3b77-4d77-af0f-65077a2275fe {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 598.316089] env[67964]: DEBUG nova.compute.manager [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Refreshing instance network info cache due to event network-changed-533485e1-3b77-4d77-af0f-65077a2275fe. 
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 598.316089] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Acquiring lock "refresh_cache-e55fdbbb-813d-427c-a53f-5be3fbeeb531" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 598.316089] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Acquired lock "refresh_cache-e55fdbbb-813d-427c-a53f-5be3fbeeb531" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 598.316236] env[67964]: DEBUG nova.network.neutron [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Refreshing network info cache for port 533485e1-3b77-4d77-af0f-65077a2275fe {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 598.342463] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456701, 'name': CreateVM_Task, 'duration_secs': 0.308466} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 598.342631] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 598.343351] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 598.344332] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 598.344704] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 598.347014] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9eb583b3-8c7d-41b3-8fed-ab2837c89784 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 598.351419] env[67964]: DEBUG oslo_vmware.api [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Waiting for the task: (returnval){ [ 598.351419] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5246f0e1-3cb0-e2b3-199a-2adfa70f6cc3" [ 598.351419] env[67964]: _type = "Task" [ 598.351419] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 598.363026] env[67964]: DEBUG oslo_vmware.api [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5246f0e1-3cb0-e2b3-199a-2adfa70f6cc3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 598.612432] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquiring lock "6580c348-f5a4-4f20-a6fb-8942202a526e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 598.612432] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Lock "6580c348-f5a4-4f20-a6fb-8942202a526e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 598.863777] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 598.864122] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 598.864701] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 598.938112] env[67964]: DEBUG nova.network.neutron [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Successfully updated port: 73ebab6a-7b2b-4d18-8993-a05deac26ddb {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 598.955272] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Acquiring lock "refresh_cache-9c586d33-c563-45c7-8c54-1638a78a669c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 598.955421] env[67964]: DEBUG 
oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Acquired lock "refresh_cache-9c586d33-c563-45c7-8c54-1638a78a669c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 598.955572] env[67964]: DEBUG nova.network.neutron [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 599.202444] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Acquiring lock "fed6991c-9b59-43bb-8cda-96053adb798b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 599.202444] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Lock "fed6991c-9b59-43bb-8cda-96053adb798b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 599.248298] env[67964]: DEBUG nova.network.neutron [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 599.344282] env[67964]: DEBUG nova.network.neutron [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Updated VIF entry in instance network info cache for port 533485e1-3b77-4d77-af0f-65077a2275fe. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 599.344630] env[67964]: DEBUG nova.network.neutron [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Updating instance_info_cache with network_info: [{"id": "533485e1-3b77-4d77-af0f-65077a2275fe", "address": "fa:16:3e:2f:f3:b1", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap533485e1-3b", "ovs_interfaceid": "533485e1-3b77-4d77-af0f-65077a2275fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 599.359374] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Releasing lock "refresh_cache-e55fdbbb-813d-427c-a53f-5be3fbeeb531" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 599.359903] env[67964]: DEBUG nova.compute.manager [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Received event network-vif-plugged-c17b4cfe-92b5-45cc-9593-bfdc74a7308e {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 599.359903] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Acquiring lock "93509103-8c02-420d-bcaa-c2cf0847b1f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 599.360111] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Lock "93509103-8c02-420d-bcaa-c2cf0847b1f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 599.360269] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Lock "93509103-8c02-420d-bcaa-c2cf0847b1f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 599.360427] env[67964]: DEBUG nova.compute.manager 
[req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] No waiting events found dispatching network-vif-plugged-c17b4cfe-92b5-45cc-9593-bfdc74a7308e {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 599.360586] env[67964]: WARNING nova.compute.manager [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Received unexpected event network-vif-plugged-c17b4cfe-92b5-45cc-9593-bfdc74a7308e for instance with vm_state building and task_state spawning. [ 599.360802] env[67964]: DEBUG nova.compute.manager [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Received event network-vif-plugged-8fb15d52-7eb3-4a14-9f66-f92e1c2e550a {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 599.360967] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Acquiring lock "371aeb17-ad59-4a01-88f7-466dfee8d293-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 599.361152] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Lock "371aeb17-ad59-4a01-88f7-466dfee8d293-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 599.361298] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Lock "371aeb17-ad59-4a01-88f7-466dfee8d293-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 599.361447] env[67964]: DEBUG nova.compute.manager [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] No waiting events found dispatching network-vif-plugged-8fb15d52-7eb3-4a14-9f66-f92e1c2e550a {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 599.362355] env[67964]: WARNING nova.compute.manager [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Received unexpected event network-vif-plugged-8fb15d52-7eb3-4a14-9f66-f92e1c2e550a for instance with vm_state building and task_state spawning. 
[ 599.362355] env[67964]: DEBUG nova.compute.manager [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Received event network-changed-c17b4cfe-92b5-45cc-9593-bfdc74a7308e {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 599.362355] env[67964]: DEBUG nova.compute.manager [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Refreshing instance network info cache due to event network-changed-c17b4cfe-92b5-45cc-9593-bfdc74a7308e. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 599.362355] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Acquiring lock "refresh_cache-93509103-8c02-420d-bcaa-c2cf0847b1f0" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 599.362355] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Acquired lock "refresh_cache-93509103-8c02-420d-bcaa-c2cf0847b1f0" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 599.362619] env[67964]: DEBUG nova.network.neutron [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Refreshing network info cache for port c17b4cfe-92b5-45cc-9593-bfdc74a7308e {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 599.562051] env[67964]: DEBUG nova.compute.manager [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Received event network-changed-8862b639-9a44-49d0-946a-7752f0f8197c {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 599.562265] env[67964]: DEBUG nova.compute.manager [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Refreshing instance network info cache due to event network-changed-8862b639-9a44-49d0-946a-7752f0f8197c. 
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 599.562474] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Acquiring lock "refresh_cache-bd297ef0-fa45-43c1-ab4e-14bcce806b35" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 599.562613] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Acquired lock "refresh_cache-bd297ef0-fa45-43c1-ab4e-14bcce806b35" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 599.562772] env[67964]: DEBUG nova.network.neutron [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Refreshing network info cache for port 8862b639-9a44-49d0-946a-7752f0f8197c {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 599.941365] env[67964]: DEBUG nova.compute.manager [req-7c621343-f1ba-4c69-8dfb-2835a4183e7c req-1887ce28-0319-4201-ae2f-d0340393ba36 service nova] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Received event network-vif-plugged-1d2d7bc9-3c84-43c3-b627-c75db5dd3256 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 599.943795] env[67964]: DEBUG oslo_concurrency.lockutils [req-7c621343-f1ba-4c69-8dfb-2835a4183e7c req-1887ce28-0319-4201-ae2f-d0340393ba36 service nova] Acquiring lock "8b261c6e-741c-4d6c-9567-566af85cd68f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 599.944433] env[67964]: DEBUG oslo_concurrency.lockutils [req-7c621343-f1ba-4c69-8dfb-2835a4183e7c req-1887ce28-0319-4201-ae2f-d0340393ba36 service nova] Lock "8b261c6e-741c-4d6c-9567-566af85cd68f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.003s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 599.944883] env[67964]: DEBUG oslo_concurrency.lockutils [req-7c621343-f1ba-4c69-8dfb-2835a4183e7c req-1887ce28-0319-4201-ae2f-d0340393ba36 service nova] Lock "8b261c6e-741c-4d6c-9567-566af85cd68f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 599.945178] env[67964]: DEBUG nova.compute.manager [req-7c621343-f1ba-4c69-8dfb-2835a4183e7c req-1887ce28-0319-4201-ae2f-d0340393ba36 service nova] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] No waiting events found dispatching network-vif-plugged-1d2d7bc9-3c84-43c3-b627-c75db5dd3256 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 599.945410] env[67964]: WARNING nova.compute.manager [req-7c621343-f1ba-4c69-8dfb-2835a4183e7c req-1887ce28-0319-4201-ae2f-d0340393ba36 service nova] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Received unexpected event network-vif-plugged-1d2d7bc9-3c84-43c3-b627-c75db5dd3256 for instance with vm_state building and task_state spawning. 
[ 599.963092] env[67964]: DEBUG nova.network.neutron [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Updated VIF entry in instance network info cache for port c17b4cfe-92b5-45cc-9593-bfdc74a7308e. {{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 599.963493] env[67964]: DEBUG nova.network.neutron [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Updating instance_info_cache with network_info: [{"id": "c17b4cfe-92b5-45cc-9593-bfdc74a7308e", "address": "fa:16:3e:e4:27:19", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.87", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc17b4cfe-92", "ovs_interfaceid": "c17b4cfe-92b5-45cc-9593-bfdc74a7308e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 599.975598] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Releasing lock "refresh_cache-93509103-8c02-420d-bcaa-c2cf0847b1f0" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 599.975598] env[67964]: DEBUG nova.compute.manager [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Received event network-changed-8fb15d52-7eb3-4a14-9f66-f92e1c2e550a {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 599.975802] env[67964]: DEBUG nova.compute.manager [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Refreshing instance network info cache due to event network-changed-8fb15d52-7eb3-4a14-9f66-f92e1c2e550a. 
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 599.976019] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Acquiring lock "refresh_cache-371aeb17-ad59-4a01-88f7-466dfee8d293" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 599.976230] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Acquired lock "refresh_cache-371aeb17-ad59-4a01-88f7-466dfee8d293" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 599.976523] env[67964]: DEBUG nova.network.neutron [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Refreshing network info cache for port 8fb15d52-7eb3-4a14-9f66-f92e1c2e550a {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 600.042763] env[67964]: DEBUG nova.network.neutron [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Updating instance_info_cache with network_info: [{"id": "73ebab6a-7b2b-4d18-8993-a05deac26ddb", "address": "fa:16:3e:7b:93:7f", "network": {"id": "4f0a0e17-236f-499d-8304-c60755a3439e", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1717422645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "503181247f834d34a1e788771dfcab0a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "170f3b82-5915-4e36-bce9-4664ebb6be5e", "external-id": "nsx-vlan-transportzone-33", "segmentation_id": 33, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73ebab6a-7b", "ovs_interfaceid": "73ebab6a-7b2b-4d18-8993-a05deac26ddb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 600.068017] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Releasing lock "refresh_cache-9c586d33-c563-45c7-8c54-1638a78a669c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 600.068017] env[67964]: DEBUG nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Instance network_info: |[{"id": "73ebab6a-7b2b-4d18-8993-a05deac26ddb", "address": "fa:16:3e:7b:93:7f", "network": {"id": "4f0a0e17-236f-499d-8304-c60755a3439e", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1717422645-network", 
"subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "503181247f834d34a1e788771dfcab0a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "170f3b82-5915-4e36-bce9-4664ebb6be5e", "external-id": "nsx-vlan-transportzone-33", "segmentation_id": 33, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73ebab6a-7b", "ovs_interfaceid": "73ebab6a-7b2b-4d18-8993-a05deac26ddb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 600.068359] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7b:93:7f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '170f3b82-5915-4e36-bce9-4664ebb6be5e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '73ebab6a-7b2b-4d18-8993-a05deac26ddb', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 600.074320] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Creating folder: Project (503181247f834d34a1e788771dfcab0a). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 600.075356] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-47315040-d957-46f5-9bc5-a3a50a96544b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 600.093195] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Created folder: Project (503181247f834d34a1e788771dfcab0a) in parent group-v690366. [ 600.093255] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Creating folder: Instances. Parent ref: group-v690397. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 600.093489] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-98c9f285-d6ee-43b5-a475-b0bd5470aeb6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 600.102660] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Created folder: Instances in parent group-v690397. 
[ 600.102899] env[67964]: DEBUG oslo.service.loopingcall [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 600.103097] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 600.103304] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-77984582-802b-4083-8015-7e106ca053bf {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 600.124793] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 600.124793] env[67964]: value = "task-3456708" [ 600.124793] env[67964]: _type = "Task" [ 600.124793] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 600.134342] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456708, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 600.360631] env[67964]: DEBUG nova.network.neutron [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Updated VIF entry in instance network info cache for port 8862b639-9a44-49d0-946a-7752f0f8197c. {{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 600.360868] env[67964]: DEBUG nova.network.neutron [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Updating instance_info_cache with network_info: [{"id": "8862b639-9a44-49d0-946a-7752f0f8197c", "address": "fa:16:3e:cf:0a:70", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.244", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8862b639-9a", "ovs_interfaceid": "8862b639-9a44-49d0-946a-7752f0f8197c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 600.379928] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Releasing lock "refresh_cache-bd297ef0-fa45-43c1-ab4e-14bcce806b35" {{(pid=67964) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 600.380258] env[67964]: DEBUG nova.compute.manager [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Received event network-changed-70e4e785-598e-48d0-9ffe-e31afffdb9d8 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 600.380444] env[67964]: DEBUG nova.compute.manager [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Refreshing instance network info cache due to event network-changed-70e4e785-598e-48d0-9ffe-e31afffdb9d8. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 600.380688] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Acquiring lock "refresh_cache-8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 600.380955] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Acquired lock "refresh_cache-8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 600.381241] env[67964]: DEBUG nova.network.neutron [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Refreshing network info cache for port 70e4e785-598e-48d0-9ffe-e31afffdb9d8 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 600.639134] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456708, 'name': CreateVM_Task, 'duration_secs': 0.315253} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 600.639355] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 600.640400] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 600.640400] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 600.640775] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 600.640904] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-38df2657-cc16-4afc-8b7c-2468ce5b1188 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 600.645607] env[67964]: DEBUG oslo_vmware.api [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Waiting for the task: (returnval){ [ 600.645607] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52fa1912-6f9c-7cc7-1249-a621c4336248" [ 600.645607] env[67964]: _type = "Task" [ 600.645607] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 600.655990] env[67964]: DEBUG oslo_vmware.api [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52fa1912-6f9c-7cc7-1249-a621c4336248, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 600.921250] env[67964]: DEBUG nova.network.neutron [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Updated VIF entry in instance network info cache for port 8fb15d52-7eb3-4a14-9f66-f92e1c2e550a. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 600.921250] env[67964]: DEBUG nova.network.neutron [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Updating instance_info_cache with network_info: [{"id": "8fb15d52-7eb3-4a14-9f66-f92e1c2e550a", "address": "fa:16:3e:8f:ee:e3", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8fb15d52-7e", "ovs_interfaceid": "8fb15d52-7eb3-4a14-9f66-f92e1c2e550a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 600.940355] env[67964]: DEBUG oslo_concurrency.lockutils [req-148a78db-0608-45e2-9430-18a7928a5323 req-7a26f424-d95a-406c-ba6c-02b04b98da44 service nova] Releasing lock "refresh_cache-371aeb17-ad59-4a01-88f7-466dfee8d293" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 601.093958] env[67964]: DEBUG nova.network.neutron [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Updated VIF entry in instance network info cache for port 70e4e785-598e-48d0-9ffe-e31afffdb9d8. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 601.094223] env[67964]: DEBUG nova.network.neutron [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Updating instance_info_cache with network_info: [{"id": "70e4e785-598e-48d0-9ffe-e31afffdb9d8", "address": "fa:16:3e:6b:0c:bf", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.199", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap70e4e785-59", "ovs_interfaceid": "70e4e785-598e-48d0-9ffe-e31afffdb9d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 601.106778] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Releasing lock "refresh_cache-8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 601.107055] env[67964]: DEBUG nova.compute.manager [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Received event network-vif-plugged-e76f8188-1437-4e64-b9ad-e21ecb2951fc {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 601.107247] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Acquiring lock "180338df-2738-4eeb-8610-cb130d04f6d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 601.107437] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Lock "180338df-2738-4eeb-8610-cb130d04f6d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 601.107593] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Lock "180338df-2738-4eeb-8610-cb130d04f6d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 601.107756] env[67964]: DEBUG nova.compute.manager 
[req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] No waiting events found dispatching network-vif-plugged-e76f8188-1437-4e64-b9ad-e21ecb2951fc {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 601.107932] env[67964]: WARNING nova.compute.manager [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Received unexpected event network-vif-plugged-e76f8188-1437-4e64-b9ad-e21ecb2951fc for instance with vm_state building and task_state spawning. [ 601.108115] env[67964]: DEBUG nova.compute.manager [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Received event network-changed-e76f8188-1437-4e64-b9ad-e21ecb2951fc {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 601.108263] env[67964]: DEBUG nova.compute.manager [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Refreshing instance network info cache due to event network-changed-e76f8188-1437-4e64-b9ad-e21ecb2951fc. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 601.108441] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Acquiring lock "refresh_cache-180338df-2738-4eeb-8610-cb130d04f6d2" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 601.109030] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Acquired lock "refresh_cache-180338df-2738-4eeb-8610-cb130d04f6d2" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 601.109030] env[67964]: DEBUG nova.network.neutron [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Refreshing network info cache for port e76f8188-1437-4e64-b9ad-e21ecb2951fc {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 601.163650] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 601.163650] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 601.163650] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Acquiring lock "[datastore1] 
devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 601.451220] env[67964]: DEBUG nova.network.neutron [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Updated VIF entry in instance network info cache for port e76f8188-1437-4e64-b9ad-e21ecb2951fc. {{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 601.452054] env[67964]: DEBUG nova.network.neutron [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Updating instance_info_cache with network_info: [{"id": "e76f8188-1437-4e64-b9ad-e21ecb2951fc", "address": "fa:16:3e:ff:68:e1", "network": {"id": "3ba557bc-4f3d-4dd5-97c0-5303ca5b8c89", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-666317981-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9b44c88280d149ddacbbde44b468049e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0ea0fc1b-0424-46ec-bef5-6b57b7d184d8", "external-id": "nsx-vlan-transportzone-618", "segmentation_id": 618, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape76f8188-14", "ovs_interfaceid": "e76f8188-1437-4e64-b9ad-e21ecb2951fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 601.465181] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Releasing lock "refresh_cache-180338df-2738-4eeb-8610-cb130d04f6d2" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 601.465332] env[67964]: DEBUG nova.compute.manager [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Received event network-changed-1d2d7bc9-3c84-43c3-b627-c75db5dd3256 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 601.465527] env[67964]: DEBUG nova.compute.manager [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Refreshing instance network info cache due to event network-changed-1d2d7bc9-3c84-43c3-b627-c75db5dd3256. 
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 601.466236] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Acquiring lock "refresh_cache-8b261c6e-741c-4d6c-9567-566af85cd68f" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 601.466236] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Acquired lock "refresh_cache-8b261c6e-741c-4d6c-9567-566af85cd68f" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 601.466236] env[67964]: DEBUG nova.network.neutron [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Refreshing network info cache for port 1d2d7bc9-3c84-43c3-b627-c75db5dd3256 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 601.726475] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Acquiring lock "707828f6-0267-42ff-95e5-6b328382b017" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 601.727869] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Lock "707828f6-0267-42ff-95e5-6b328382b017" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 601.883782] env[67964]: DEBUG nova.network.neutron [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Updated VIF entry in instance network info cache for port 1d2d7bc9-3c84-43c3-b627-c75db5dd3256. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 601.884262] env[67964]: DEBUG nova.network.neutron [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Updating instance_info_cache with network_info: [{"id": "1d2d7bc9-3c84-43c3-b627-c75db5dd3256", "address": "fa:16:3e:f9:88:ec", "network": {"id": "8140da01-5247-483f-8a73-214e2182369e", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1878883737-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "28708549a6d54cabb9321784a134305a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d2d7bc9-3c", "ovs_interfaceid": "1d2d7bc9-3c84-43c3-b627-c75db5dd3256", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 601.897750] env[67964]: DEBUG oslo_concurrency.lockutils [req-46c7e3fd-5e11-4ea1-b613-e51e9d4015b2 req-37bbb9cb-9b0a-43b1-9f69-aed3b9649974 service nova] Releasing lock "refresh_cache-8b261c6e-741c-4d6c-9567-566af85cd68f" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 602.392177] env[67964]: DEBUG nova.compute.manager [req-9e274b84-0413-4fbe-bbdb-4d01fc7e4e64 req-56738ba9-c89d-4332-8cf6-5fedb79d5979 service nova] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Received event network-vif-plugged-73ebab6a-7b2b-4d18-8993-a05deac26ddb {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 602.392443] env[67964]: DEBUG oslo_concurrency.lockutils [req-9e274b84-0413-4fbe-bbdb-4d01fc7e4e64 req-56738ba9-c89d-4332-8cf6-5fedb79d5979 service nova] Acquiring lock "9c586d33-c563-45c7-8c54-1638a78a669c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 602.392480] env[67964]: DEBUG oslo_concurrency.lockutils [req-9e274b84-0413-4fbe-bbdb-4d01fc7e4e64 req-56738ba9-c89d-4332-8cf6-5fedb79d5979 service nova] Lock "9c586d33-c563-45c7-8c54-1638a78a669c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 602.392646] env[67964]: DEBUG oslo_concurrency.lockutils [req-9e274b84-0413-4fbe-bbdb-4d01fc7e4e64 req-56738ba9-c89d-4332-8cf6-5fedb79d5979 service nova] Lock "9c586d33-c563-45c7-8c54-1638a78a669c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 602.392807] env[67964]: DEBUG 
nova.compute.manager [req-9e274b84-0413-4fbe-bbdb-4d01fc7e4e64 req-56738ba9-c89d-4332-8cf6-5fedb79d5979 service nova] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] No waiting events found dispatching network-vif-plugged-73ebab6a-7b2b-4d18-8993-a05deac26ddb {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 602.393437] env[67964]: WARNING nova.compute.manager [req-9e274b84-0413-4fbe-bbdb-4d01fc7e4e64 req-56738ba9-c89d-4332-8cf6-5fedb79d5979 service nova] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Received unexpected event network-vif-plugged-73ebab6a-7b2b-4d18-8993-a05deac26ddb for instance with vm_state building and task_state spawning. [ 602.394134] env[67964]: DEBUG nova.compute.manager [req-9e274b84-0413-4fbe-bbdb-4d01fc7e4e64 req-56738ba9-c89d-4332-8cf6-5fedb79d5979 service nova] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Received event network-changed-73ebab6a-7b2b-4d18-8993-a05deac26ddb {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 602.396747] env[67964]: DEBUG nova.compute.manager [req-9e274b84-0413-4fbe-bbdb-4d01fc7e4e64 req-56738ba9-c89d-4332-8cf6-5fedb79d5979 service nova] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Refreshing instance network info cache due to event network-changed-73ebab6a-7b2b-4d18-8993-a05deac26ddb. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 602.396747] env[67964]: DEBUG oslo_concurrency.lockutils [req-9e274b84-0413-4fbe-bbdb-4d01fc7e4e64 req-56738ba9-c89d-4332-8cf6-5fedb79d5979 service nova] Acquiring lock "refresh_cache-9c586d33-c563-45c7-8c54-1638a78a669c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 602.396747] env[67964]: DEBUG oslo_concurrency.lockutils [req-9e274b84-0413-4fbe-bbdb-4d01fc7e4e64 req-56738ba9-c89d-4332-8cf6-5fedb79d5979 service nova] Acquired lock "refresh_cache-9c586d33-c563-45c7-8c54-1638a78a669c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 602.396747] env[67964]: DEBUG nova.network.neutron [req-9e274b84-0413-4fbe-bbdb-4d01fc7e4e64 req-56738ba9-c89d-4332-8cf6-5fedb79d5979 service nova] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Refreshing network info cache for port 73ebab6a-7b2b-4d18-8993-a05deac26ddb {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 603.158505] env[67964]: DEBUG nova.network.neutron [req-9e274b84-0413-4fbe-bbdb-4d01fc7e4e64 req-56738ba9-c89d-4332-8cf6-5fedb79d5979 service nova] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Updated VIF entry in instance network info cache for port 73ebab6a-7b2b-4d18-8993-a05deac26ddb. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 603.158505] env[67964]: DEBUG nova.network.neutron [req-9e274b84-0413-4fbe-bbdb-4d01fc7e4e64 req-56738ba9-c89d-4332-8cf6-5fedb79d5979 service nova] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Updating instance_info_cache with network_info: [{"id": "73ebab6a-7b2b-4d18-8993-a05deac26ddb", "address": "fa:16:3e:7b:93:7f", "network": {"id": "4f0a0e17-236f-499d-8304-c60755a3439e", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1717422645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "503181247f834d34a1e788771dfcab0a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "170f3b82-5915-4e36-bce9-4664ebb6be5e", "external-id": "nsx-vlan-transportzone-33", "segmentation_id": 33, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73ebab6a-7b", "ovs_interfaceid": "73ebab6a-7b2b-4d18-8993-a05deac26ddb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 603.172833] env[67964]: DEBUG oslo_concurrency.lockutils [req-9e274b84-0413-4fbe-bbdb-4d01fc7e4e64 req-56738ba9-c89d-4332-8cf6-5fedb79d5979 service nova] Releasing lock "refresh_cache-9c586d33-c563-45c7-8c54-1638a78a669c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 608.609681] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Acquiring lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 608.610033] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 610.831567] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e75c5a43-89ad-429d-916e-af1268f0c030 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Acquiring lock "0ad6ab85-b1d6-479d-85a4-ff8ce5fb26e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 610.831567] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e75c5a43-89ad-429d-916e-af1268f0c030 tempest-ListServerFiltersTestJSON-1359406749 
tempest-ListServerFiltersTestJSON-1359406749-project-member] Lock "0ad6ab85-b1d6-479d-85a4-ff8ce5fb26e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 611.589981] env[67964]: DEBUG oslo_concurrency.lockutils [None req-5e85a1ef-d577-48c0-b398-bc3ae3f17bd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "38541c24-fc6b-4385-91bf-de25df66a798" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 611.590325] env[67964]: DEBUG oslo_concurrency.lockutils [None req-5e85a1ef-d577-48c0-b398-bc3ae3f17bd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "38541c24-fc6b-4385-91bf-de25df66a798" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 612.186902] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dc4fe31c-7623-462b-8302-75835785f8ac tempest-VolumesAssistedSnapshotsTest-1589554248 tempest-VolumesAssistedSnapshotsTest-1589554248-project-member] Acquiring lock "068b288c-194b-4d2c-89c3-8adb7d628cc7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 612.187278] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dc4fe31c-7623-462b-8302-75835785f8ac tempest-VolumesAssistedSnapshotsTest-1589554248 tempest-VolumesAssistedSnapshotsTest-1589554248-project-member] Lock "068b288c-194b-4d2c-89c3-8adb7d628cc7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 613.581916] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7d77b253-0fe5-4cc4-9479-7f7d1f381d3d tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Acquiring lock "834fdc0b-5b2f-4374-a77a-de970c10e125" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 613.581916] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7d77b253-0fe5-4cc4-9479-7f7d1f381d3d tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Lock "834fdc0b-5b2f-4374-a77a-de970c10e125" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 616.030746] env[67964]: DEBUG oslo_concurrency.lockutils [None req-54a71b18-cf24-437d-82bb-b8086dd8588a tempest-ServersAdmin275Test-280874369 tempest-ServersAdmin275Test-280874369-project-member] Acquiring lock "f50fc747-d3b4-456c-b86f-a086a7968329" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 616.034181] env[67964]: DEBUG oslo_concurrency.lockutils [None req-54a71b18-cf24-437d-82bb-b8086dd8588a tempest-ServersAdmin275Test-280874369 tempest-ServersAdmin275Test-280874369-project-member] Lock "f50fc747-d3b4-456c-b86f-a086a7968329" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.003s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 616.355667] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e5250760-6ed9-47eb-8175-6ce343c66ac9 tempest-ListImageFiltersTestJSON-223070775 tempest-ListImageFiltersTestJSON-223070775-project-member] Acquiring lock "f9e00f5a-036a-4141-b4ae-bda4c8a4c11b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 616.356191] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e5250760-6ed9-47eb-8175-6ce343c66ac9 tempest-ListImageFiltersTestJSON-223070775 tempest-ListImageFiltersTestJSON-223070775-project-member] Lock "f9e00f5a-036a-4141-b4ae-bda4c8a4c11b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 617.127825] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2a789350-41a4-4acc-8b9a-6b83ffe27ae6 tempest-ImagesNegativeTestJSON-124339597 tempest-ImagesNegativeTestJSON-124339597-project-member] Acquiring lock "188890b5-2189-4499-9856-22dc65b6c6f1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 617.127825] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2a789350-41a4-4acc-8b9a-6b83ffe27ae6 tempest-ImagesNegativeTestJSON-124339597 tempest-ImagesNegativeTestJSON-124339597-project-member] Lock "188890b5-2189-4499-9856-22dc65b6c6f1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 618.078400] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c3274f14-016e-4655-bde3-8f9ae729ec9e tempest-ServersWithSpecificFlavorTestJSON-1528302942 tempest-ServersWithSpecificFlavorTestJSON-1528302942-project-member] Acquiring lock "d84f0e97-24d3-4b0b-8eff-51cf6bfd980c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 618.078716] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c3274f14-016e-4655-bde3-8f9ae729ec9e tempest-ServersWithSpecificFlavorTestJSON-1528302942 tempest-ServersWithSpecificFlavorTestJSON-1528302942-project-member] Lock "d84f0e97-24d3-4b0b-8eff-51cf6bfd980c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 619.488472] env[67964]: DEBUG oslo_concurrency.lockutils [None req-51b9fbd5-60e6-4b63-9b0e-7070e6505d41 tempest-ListImageFiltersTestJSON-223070775 tempest-ListImageFiltersTestJSON-223070775-project-member] Acquiring lock "f963e0c6-8a3d-4872-8cae-07fff845b77f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 619.488796] env[67964]: DEBUG oslo_concurrency.lockutils [None req-51b9fbd5-60e6-4b63-9b0e-7070e6505d41 tempest-ListImageFiltersTestJSON-223070775 tempest-ListImageFiltersTestJSON-223070775-project-member] Lock "f963e0c6-8a3d-4872-8cae-07fff845b77f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 620.585659] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6a025ec9-bcde-4f7b-b88f-736df426b959 tempest-ServersTestBootFromVolume-544690349 tempest-ServersTestBootFromVolume-544690349-project-member] Acquiring lock "8028a7dd-4002-4db3-a738-3926c1d2340e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 620.585931] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6a025ec9-bcde-4f7b-b88f-736df426b959 tempest-ServersTestBootFromVolume-544690349 tempest-ServersTestBootFromVolume-544690349-project-member] Lock "8028a7dd-4002-4db3-a738-3926c1d2340e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 621.639394] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9770d661-95f0-420f-aec2-ae8674b60e20 tempest-FloatingIPsAssociationTestJSON-985246946 tempest-FloatingIPsAssociationTestJSON-985246946-project-member] Acquiring lock "743b56db-49a0-4af7-96bc-a3fc6025fa19" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 621.639394] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9770d661-95f0-420f-aec2-ae8674b60e20 tempest-FloatingIPsAssociationTestJSON-985246946 tempest-FloatingIPsAssociationTestJSON-985246946-project-member] Lock "743b56db-49a0-4af7-96bc-a3fc6025fa19" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 621.644737] env[67964]: DEBUG oslo_concurrency.lockutils [None req-cd0282c1-c34c-42b8-9f6a-3c6745b2c172 tempest-SecurityGroupsTestJSON-63199574 tempest-SecurityGroupsTestJSON-63199574-project-member] Acquiring lock "45e517a6-1ef1-4082-b5f9-24a9c932630c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 621.644737] env[67964]: DEBUG oslo_concurrency.lockutils [None req-cd0282c1-c34c-42b8-9f6a-3c6745b2c172 
tempest-SecurityGroupsTestJSON-63199574 tempest-SecurityGroupsTestJSON-63199574-project-member] Lock "45e517a6-1ef1-4082-b5f9-24a9c932630c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 631.610838] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9b478b52-92e0-4f1b-b4f8-de3b304cfe35 tempest-ServerGroupTestJSON-612408333 tempest-ServerGroupTestJSON-612408333-project-member] Acquiring lock "5b7a605b-7521-40c3-92d1-ce5487f6fedd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 631.610838] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9b478b52-92e0-4f1b-b4f8-de3b304cfe35 tempest-ServerGroupTestJSON-612408333 tempest-ServerGroupTestJSON-612408333-project-member] Lock "5b7a605b-7521-40c3-92d1-ce5487f6fedd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 636.398820] env[67964]: WARNING oslo_vmware.rw_handles [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 636.398820] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 636.398820] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 636.398820] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 636.398820] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 636.398820] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 636.398820] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 636.398820] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 636.398820] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 636.398820] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 636.398820] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 636.398820] env[67964]: ERROR oslo_vmware.rw_handles [ 636.399605] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/2bdf92bf-bf35-40b1-86aa-e40512c8f96b/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 636.401722] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] 
[instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 636.404050] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Copying Virtual Disk [datastore1] vmware_temp/2bdf92bf-bf35-40b1-86aa-e40512c8f96b/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/2bdf92bf-bf35-40b1-86aa-e40512c8f96b/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 636.405766] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-575d6676-8dc4-4791-9135-a3e91af7b089 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.414931] env[67964]: DEBUG oslo_vmware.api [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Waiting for the task: (returnval){ [ 636.414931] env[67964]: value = "task-3456716" [ 636.414931] env[67964]: _type = "Task" [ 636.414931] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 636.424249] env[67964]: DEBUG oslo_vmware.api [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Task: {'id': task-3456716, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 636.927826] env[67964]: DEBUG oslo_vmware.exceptions [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Fault InvalidArgument not matched. 
{{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 636.927826] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 636.930167] env[67964]: ERROR nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 636.930167] env[67964]: Faults: ['InvalidArgument'] [ 636.930167] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Traceback (most recent call last): [ 636.930167] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 636.930167] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] yield resources [ 636.930167] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 636.930167] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] self.driver.spawn(context, instance, image_meta, [ 636.930167] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 636.930167] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 636.930167] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 636.930167] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] self._fetch_image_if_missing(context, vi) [ 636.930167] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 636.930565] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] image_cache(vi, tmp_image_ds_loc) [ 636.930565] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 636.930565] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] vm_util.copy_virtual_disk( [ 636.930565] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 636.930565] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] session._wait_for_task(vmdk_copy_task) [ 636.930565] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 636.930565] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] return self.wait_for_task(task_ref) [ 636.930565] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 636.930565] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] return evt.wait() [ 636.930565] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 636.930565] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] result = hub.switch() [ 636.930565] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 636.930565] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] return self.greenlet.switch() [ 636.930998] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 636.930998] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] self.f(*self.args, **self.kw) [ 636.930998] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 636.930998] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] raise exceptions.translate_fault(task_info.error) [ 636.930998] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 636.930998] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Faults: ['InvalidArgument'] [ 636.930998] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] [ 636.930998] env[67964]: INFO nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Terminating instance [ 636.932137] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 636.932326] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 636.932978] env[67964]: DEBUG nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 
6ebecddf-098f-447f-a350-6644b50f87f7] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 636.933189] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 636.933415] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fc3697f3-6cb7-4068-acd3-09643cf59d95 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.936151] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc7e7348-f838-4b4c-9a49-af49396a9936 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.944564] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 636.944795] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4f65b9fb-733b-4129-b514-2c758f127213 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.947167] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 636.947402] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 636.948330] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-40ceb5ad-ef85-40e8-88d0-73091cab9e45 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.955059] env[67964]: DEBUG oslo_vmware.api [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Waiting for the task: (returnval){ [ 636.955059] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52702013-5338-3508-0fb8-9615f0395e51" [ 636.955059] env[67964]: _type = "Task" [ 636.955059] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 636.968358] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 636.968592] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Creating directory with path [datastore1] vmware_temp/ec0617bb-16c4-49cb-90fd-fe26ed3e590a/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 636.968808] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-476c2559-d7d9-471e-8ff4-666e34746784 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.989354] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Created directory with path [datastore1] vmware_temp/ec0617bb-16c4-49cb-90fd-fe26ed3e590a/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 636.992095] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Fetch image to [datastore1] vmware_temp/ec0617bb-16c4-49cb-90fd-fe26ed3e590a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 636.992095] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/ec0617bb-16c4-49cb-90fd-fe26ed3e590a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 636.992095] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85c5c1d5-4c66-43ec-a7e4-aac588204d85 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.000986] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-920cd21a-9846-41cc-8153-c7e05ec29aae {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.017068] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f139e6f2-c8c7-45b0-94fe-0f680a2d166f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.022487] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 
tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 637.022701] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 637.022872] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Deleting the datastore file [datastore1] 6ebecddf-098f-447f-a350-6644b50f87f7 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 637.023588] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-68c598c9-a105-447d-b195-d6aec62cede7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.059551] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e796bbf-1bfa-430a-9306-05e1150b8c39 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.067574] env[67964]: DEBUG oslo_vmware.api [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Waiting for the task: (returnval){ [ 637.067574] env[67964]: value = "task-3456718" [ 637.067574] env[67964]: _type = "Task" [ 637.067574] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 637.086500] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-958f3e73-f2a5-4591-b6ae-b131b9ac50a4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.100771] env[67964]: DEBUG oslo_vmware.api [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Task: {'id': task-3456718, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 637.119401] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 637.234735] env[67964]: DEBUG oslo_vmware.rw_handles [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ec0617bb-16c4-49cb-90fd-fe26ed3e590a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 637.299797] env[67964]: DEBUG oslo_vmware.rw_handles [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 637.299916] env[67964]: DEBUG oslo_vmware.rw_handles [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ec0617bb-16c4-49cb-90fd-fe26ed3e590a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 637.580079] env[67964]: DEBUG oslo_vmware.api [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Task: {'id': task-3456718, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067543} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 637.580721] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 637.580721] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 637.580721] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 637.581276] env[67964]: INFO nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Took 0.65 seconds to destroy the instance on the hypervisor. 
[ 637.584748] env[67964]: DEBUG nova.compute.claims [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 637.585151] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 637.585151] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 638.084019] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-616343fd-04b7-4ab1-8b19-bb603ed13d0f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.091773] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0317ad4-376a-47d2-a381-0da5a476d276 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.121563] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d072e0aa-98f0-4d51-be1a-ddea771d50ff {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.128927] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bf7a4f1-0d67-4247-af9d-4770ba53d1b8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.143399] env[67964]: DEBUG nova.compute.provider_tree [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 638.158898] env[67964]: DEBUG nova.scheduler.client.report [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 638.176556] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 
tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.589s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 638.176556] env[67964]: ERROR nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 638.176556] env[67964]: Faults: ['InvalidArgument'] [ 638.176556] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Traceback (most recent call last): [ 638.176556] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 638.176556] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] self.driver.spawn(context, instance, image_meta, [ 638.176556] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 638.176556] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 638.176556] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 638.176556] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] self._fetch_image_if_missing(context, vi) [ 638.177025] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 638.177025] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] image_cache(vi, tmp_image_ds_loc) [ 638.177025] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 638.177025] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] vm_util.copy_virtual_disk( [ 638.177025] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 638.177025] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] session._wait_for_task(vmdk_copy_task) [ 638.177025] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 638.177025] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] return self.wait_for_task(task_ref) [ 638.177025] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 638.177025] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] return evt.wait() [ 638.177025] env[67964]: ERROR 
nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 638.177025] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] result = hub.switch() [ 638.177025] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 638.177451] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] return self.greenlet.switch() [ 638.177451] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 638.177451] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] self.f(*self.args, **self.kw) [ 638.177451] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 638.177451] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] raise exceptions.translate_fault(task_info.error) [ 638.177451] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 638.177451] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Faults: ['InvalidArgument'] [ 638.177451] env[67964]: ERROR nova.compute.manager [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] [ 638.177451] env[67964]: DEBUG nova.compute.utils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 638.183390] env[67964]: DEBUG nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Build of instance 6ebecddf-098f-447f-a350-6644b50f87f7 was re-scheduled: A specified parameter was not correct: fileType [ 638.183390] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 638.183390] env[67964]: DEBUG nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 638.183390] env[67964]: DEBUG nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 638.183390] env[67964]: DEBUG nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 638.183802] env[67964]: DEBUG nova.network.neutron [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 638.797939] env[67964]: DEBUG nova.network.neutron [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 638.816919] env[67964]: INFO nova.compute.manager [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] [instance: 6ebecddf-098f-447f-a350-6644b50f87f7] Took 0.63 seconds to deallocate network for instance. [ 638.944315] env[67964]: INFO nova.scheduler.client.report [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Deleted allocations for instance 6ebecddf-098f-447f-a350-6644b50f87f7 [ 638.980932] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c10fc361-7b7f-40ec-9231-5fd0254e3126 tempest-ServerExternalEventsTest-805349393 tempest-ServerExternalEventsTest-805349393-project-member] Lock "6ebecddf-098f-447f-a350-6644b50f87f7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 60.942s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 639.011593] env[67964]: DEBUG nova.compute.manager [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Starting instance... 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 639.092509] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 639.092509] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 639.096175] env[67964]: INFO nova.compute.claims [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 639.278059] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Acquiring lock "9e47d3ce-3897-458b-ac85-d98745e9aeb5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 639.278059] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Lock "9e47d3ce-3897-458b-ac85-d98745e9aeb5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 639.461453] env[67964]: DEBUG oslo_concurrency.lockutils [None req-aa031614-c845-4612-952a-5ada591a2966 tempest-AttachInterfacesUnderV243Test-544305701 tempest-AttachInterfacesUnderV243Test-544305701-project-member] Acquiring lock "b9a04994-804b-47b8-bc9f-cf4f18f27f5b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 639.461674] env[67964]: DEBUG oslo_concurrency.lockutils [None req-aa031614-c845-4612-952a-5ada591a2966 tempest-AttachInterfacesUnderV243Test-544305701 tempest-AttachInterfacesUnderV243Test-544305701-project-member] Lock "b9a04994-804b-47b8-bc9f-cf4f18f27f5b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 639.608121] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44f2a45e-6262-4ee6-8ff7-5c3f6f73dd95 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.623017] env[67964]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2db16cd3-d5cc-423a-b3c6-11fcd0e0507a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.654548] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-271a550e-2e7f-493c-b064-51fc34279952 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.664652] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35baa8c7-aa59-4cb2-bd40-41c5abb2c212 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.678646] env[67964]: DEBUG nova.compute.provider_tree [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 639.694257] env[67964]: DEBUG nova.scheduler.client.report [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 639.711872] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.619s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 639.712682] env[67964]: DEBUG nova.compute.manager [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 639.806289] env[67964]: DEBUG nova.compute.utils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 639.809242] env[67964]: DEBUG nova.compute.manager [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Not allocating networking since 'none' was specified. 
{{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1953}} [ 639.822899] env[67964]: DEBUG nova.compute.manager [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 639.908527] env[67964]: DEBUG nova.compute.manager [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Start spawning the instance on the hypervisor. {{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 639.939018] env[67964]: DEBUG nova.virt.hardware [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 639.939018] env[67964]: DEBUG nova.virt.hardware [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 639.939018] env[67964]: DEBUG nova.virt.hardware [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 639.939440] env[67964]: DEBUG nova.virt.hardware [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 639.939440] env[67964]: DEBUG nova.virt.hardware [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 639.939440] env[67964]: DEBUG nova.virt.hardware [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 
639.939440] env[67964]: DEBUG nova.virt.hardware [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 639.939440] env[67964]: DEBUG nova.virt.hardware [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 639.939753] env[67964]: DEBUG nova.virt.hardware [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 639.939753] env[67964]: DEBUG nova.virt.hardware [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 639.939753] env[67964]: DEBUG nova.virt.hardware [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 639.939753] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33916630-9f10-45af-863b-7096e4fdf540 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.947507] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0040809-3faa-4e15-bb18-ad99ff6b23f7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.963601] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Instance VIF info [] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 639.969368] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Creating folder: Project (ddbd70bf64f84678a238a604215cc274). Parent ref: group-v690366. 
{{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 639.969649] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9d2ce8ee-74c3-4c16-9ed3-5727da38b909 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.978995] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Created folder: Project (ddbd70bf64f84678a238a604215cc274) in parent group-v690366. [ 639.982755] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Creating folder: Instances. Parent ref: group-v690401. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 639.982755] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-457b2a64-802b-4523-b8d1-71010441b079 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.988237] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Created folder: Instances in parent group-v690401. [ 639.988485] env[67964]: DEBUG oslo.service.loopingcall [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 639.988629] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 639.988821] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-df0acf72-6791-4878-8897-5a6fc95042c8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.005105] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 640.005105] env[67964]: value = "task-3456721" [ 640.005105] env[67964]: _type = "Task" [ 640.005105] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 640.013287] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456721, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 640.517400] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456721, 'name': CreateVM_Task, 'duration_secs': 0.352527} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 640.517400] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 640.517400] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 640.517400] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 640.517400] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 640.517705] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9e7ccbcb-12a3-47ba-9f2a-66cfcc360f72 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.523181] env[67964]: DEBUG oslo_vmware.api [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Waiting for the task: (returnval){ [ 640.523181] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52ee7242-138f-ac7a-fcd4-361117d9b058" [ 640.523181] env[67964]: _type = "Task" [ 640.523181] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 640.532219] env[67964]: DEBUG oslo_vmware.api [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52ee7242-138f-ac7a-fcd4-361117d9b058, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 640.545420] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6858452e-abec-4bc0-8cfa-c37aa7286f1e tempest-ServerActionsTestOtherB-968802063 tempest-ServerActionsTestOtherB-968802063-project-member] Acquiring lock "658d81c6-dd54-4af5-b51f-6b0ce8fb9336" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 640.545627] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6858452e-abec-4bc0-8cfa-c37aa7286f1e tempest-ServerActionsTestOtherB-968802063 tempest-ServerActionsTestOtherB-968802063-project-member] Lock "658d81c6-dd54-4af5-b51f-6b0ce8fb9336" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 641.039656] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 641.040019] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 641.042475] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 644.975945] env[67964]: DEBUG oslo_concurrency.lockutils [None req-301e0b5c-07bc-43e1-892c-706f071bae3d tempest-ServersV294TestFqdnHostnames-1254988542 tempest-ServersV294TestFqdnHostnames-1254988542-project-member] Acquiring lock "9dc1cdba-3991-4ee2-b92b-2800e17f07a8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 644.976522] env[67964]: DEBUG oslo_concurrency.lockutils [None req-301e0b5c-07bc-43e1-892c-706f071bae3d tempest-ServersV294TestFqdnHostnames-1254988542 tempest-ServersV294TestFqdnHostnames-1254988542-project-member] Lock "9dc1cdba-3991-4ee2-b92b-2800e17f07a8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 654.109736] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d5607a7b-08df-4fd9-88e7-d5361f97562d tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] Acquiring lock "9f51b0ff-a0fc-4a94-9d1e-578347c2f776" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 654.110088] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d5607a7b-08df-4fd9-88e7-d5361f97562d tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] Lock "9f51b0ff-a0fc-4a94-9d1e-578347c2f776" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 654.137065] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d5607a7b-08df-4fd9-88e7-d5361f97562d tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] Acquiring lock "8b5afa56-a56a-4990-9ba7-2c0955579a65" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 654.137373] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d5607a7b-08df-4fd9-88e7-d5361f97562d tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] Lock "8b5afa56-a56a-4990-9ba7-2c0955579a65" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 654.280016] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8110f381-0d67-4dd9-9635-e69c1173f6e6 tempest-ServerActionsTestOtherA-1206453540 tempest-ServerActionsTestOtherA-1206453540-project-member] Acquiring lock "dcb0d0da-a987-46b2-be64-672ee3200eab" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 654.280249] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8110f381-0d67-4dd9-9635-e69c1173f6e6 tempest-ServerActionsTestOtherA-1206453540 tempest-ServerActionsTestOtherA-1206453540-project-member] Lock "dcb0d0da-a987-46b2-be64-672ee3200eab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 654.399489] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 654.426986] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 654.426986] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 654.426986] env[67964]: DEBUG oslo_service.periodic_task [None 
req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 654.426986] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 654.426986] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 654.439705] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 654.439946] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 654.440602] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 654.440791] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 654.442169] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3281c7c-33fd-414c-add6-49d107f41eca {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.451481] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0656b1b1-c4c8-46fe-9eb1-c121c42255a7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.467237] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e82043e9-f750-42f2-84a6-b52fb25a0442 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.474715] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-852a70c8-e253-4404-8cec-25e9d3e19a1d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.510518] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180902MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) 
_report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 654.510684] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 654.510875] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 654.587254] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance e55fdbbb-813d-427c-a53f-5be3fbeeb531 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 654.588094] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 93509103-8c02-420d-bcaa-c2cf0847b1f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 654.588094] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 371aeb17-ad59-4a01-88f7-466dfee8d293 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 654.588094] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 82096302-bbdd-49b4-bd19-bdf75343e03a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 654.588094] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance bd297ef0-fa45-43c1-ab4e-14bcce806b35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 654.588307] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 654.588307] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 180338df-2738-4eeb-8610-cb130d04f6d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 654.588307] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8b261c6e-741c-4d6c-9567-566af85cd68f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 654.588409] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9c586d33-c563-45c7-8c54-1638a78a669c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 654.588443] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 6580c348-f5a4-4f20-a6fb-8942202a526e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 654.618672] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fed6991c-9b59-43bb-8cda-96053adb798b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.646237] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 707828f6-0267-42ff-95e5-6b328382b017 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.662930] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0768fe80-7dd3-42ec-8e22-42a6aece5bef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.681639] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0ad6ab85-b1d6-479d-85a4-ff8ce5fb26e4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.699702] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 38541c24-fc6b-4385-91bf-de25df66a798 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.715718] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 068b288c-194b-4d2c-89c3-8adb7d628cc7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.727276] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 834fdc0b-5b2f-4374-a77a-de970c10e125 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.739584] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f50fc747-d3b4-456c-b86f-a086a7968329 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.754092] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f9e00f5a-036a-4141-b4ae-bda4c8a4c11b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.764966] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 188890b5-2189-4499-9856-22dc65b6c6f1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.777281] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d84f0e97-24d3-4b0b-8eff-51cf6bfd980c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.790321] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f963e0c6-8a3d-4872-8cae-07fff845b77f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.803330] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8028a7dd-4002-4db3-a738-3926c1d2340e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.814092] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 743b56db-49a0-4af7-96bc-a3fc6025fa19 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.846202] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 45e517a6-1ef1-4082-b5f9-24a9c932630c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.857057] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 5b7a605b-7521-40c3-92d1-ce5487f6fedd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.867959] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9e47d3ce-3897-458b-ac85-d98745e9aeb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.882257] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance b9a04994-804b-47b8-bc9f-cf4f18f27f5b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.894046] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 658d81c6-dd54-4af5-b51f-6b0ce8fb9336 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.904116] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9dc1cdba-3991-4ee2-b92b-2800e17f07a8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.914716] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9f51b0ff-a0fc-4a94-9d1e-578347c2f776 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.926631] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8b5afa56-a56a-4990-9ba7-2c0955579a65 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.937785] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance dcb0d0da-a987-46b2-be64-672ee3200eab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 654.938249] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 654.938613] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 655.418802] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0481a525-d4d1-4f4b-b563-ce471af538c9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.425245] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3a3f222-12f5-4cd4-aa59-57241368914e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.458963] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d16aecd4-b239-42af-bbaf-b244725e8cfd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.467867] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f97c486f-f91e-481f-bdf8-5f3b856fa53d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.483534] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 655.492038] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 655.510531] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 655.510801] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 655.692109] env[67964]: DEBUG oslo_concurrency.lockutils [None 
req-5c869298-aaf3-4224-8d6d-dbd99e7c1448 tempest-InstanceActionsV221TestJSON-959559326 tempest-InstanceActionsV221TestJSON-959559326-project-member] Acquiring lock "2be271b8-775a-4c51-aa27-75a6a29e270b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 655.692376] env[67964]: DEBUG oslo_concurrency.lockutils [None req-5c869298-aaf3-4224-8d6d-dbd99e7c1448 tempest-InstanceActionsV221TestJSON-959559326 tempest-InstanceActionsV221TestJSON-959559326-project-member] Lock "2be271b8-775a-4c51-aa27-75a6a29e270b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 655.886557] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 655.887375] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 655.887375] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 655.887375] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 655.909581] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 655.909740] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 655.910339] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 655.910339] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 655.910339] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 655.910597] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 655.910597] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 655.910686] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 655.910802] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 655.910933] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 655.911076] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 655.911676] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 655.911747] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 655.911880] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 682.364484] env[67964]: WARNING oslo_vmware.rw_handles [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 682.364484] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 682.364484] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 682.364484] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 682.364484] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 682.364484] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 682.364484] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 682.364484] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 682.364484] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 682.364484] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 682.364484] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 682.364484] env[67964]: ERROR oslo_vmware.rw_handles [ 682.364988] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/ec0617bb-16c4-49cb-90fd-fe26ed3e590a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 682.366521] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 682.366794] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Copying Virtual Disk [datastore1] vmware_temp/ec0617bb-16c4-49cb-90fd-fe26ed3e590a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/ec0617bb-16c4-49cb-90fd-fe26ed3e590a/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 682.367124] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ea34da0c-f6d1-4aa6-b537-aa1a03e7230a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 682.375639] env[67964]: DEBUG oslo_vmware.api [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Waiting for the task: (returnval){ [ 682.375639] env[67964]: 
value = "task-3456722" [ 682.375639] env[67964]: _type = "Task" [ 682.375639] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 682.383449] env[67964]: DEBUG oslo_vmware.api [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Task: {'id': task-3456722, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 682.886746] env[67964]: DEBUG oslo_vmware.exceptions [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 682.887059] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 682.887712] env[67964]: ERROR nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 682.887712] env[67964]: Faults: ['InvalidArgument'] [ 682.887712] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Traceback (most recent call last): [ 682.887712] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 682.887712] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] yield resources [ 682.887712] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 682.887712] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] self.driver.spawn(context, instance, image_meta, [ 682.887712] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 682.887712] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] self._vmops.spawn(context, instance, image_meta, injected_files, [ 682.887712] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 682.887712] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] self._fetch_image_if_missing(context, vi) [ 682.887712] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 682.888104] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] image_cache(vi, tmp_image_ds_loc) [ 682.888104] env[67964]: 
ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 682.888104] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] vm_util.copy_virtual_disk( [ 682.888104] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 682.888104] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] session._wait_for_task(vmdk_copy_task) [ 682.888104] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 682.888104] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] return self.wait_for_task(task_ref) [ 682.888104] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 682.888104] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] return evt.wait() [ 682.888104] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 682.888104] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] result = hub.switch() [ 682.888104] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 682.888104] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] return self.greenlet.switch() [ 682.888466] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 682.888466] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] self.f(*self.args, **self.kw) [ 682.888466] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 682.888466] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] raise exceptions.translate_fault(task_info.error) [ 682.888466] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 682.888466] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Faults: ['InvalidArgument'] [ 682.888466] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] [ 682.888466] env[67964]: INFO nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Terminating instance [ 682.889786] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 682.889786] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 682.890054] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-db33a72d-1cfe-4433-ab9b-05750d8f2602 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 682.892798] env[67964]: DEBUG nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 682.892988] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 682.893730] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44579e30-3685-408f-98e5-c173992196de {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 682.900784] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 682.901011] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-15327adb-2939-4604-aeed-c2a0abb52c9d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 682.903236] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 682.903406] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 682.904350] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bf0ad21a-34d1-49b5-96d2-0f2aa76895b1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 682.908996] env[67964]: DEBUG oslo_vmware.api [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Waiting for the task: (returnval){ [ 682.908996] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52171a8b-754c-8632-2a4d-472445e0432d" [ 682.908996] env[67964]: _type = "Task" [ 682.908996] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 682.915988] env[67964]: DEBUG oslo_vmware.api [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52171a8b-754c-8632-2a4d-472445e0432d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 682.968211] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 682.968445] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 682.968633] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Deleting the datastore file [datastore1] e55fdbbb-813d-427c-a53f-5be3fbeeb531 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 682.968885] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-12ab729d-d4dd-4d4f-af30-9b8b070624a3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 682.974725] env[67964]: DEBUG oslo_vmware.api [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Waiting for the task: (returnval){ [ 682.974725] env[67964]: value = "task-3456724" [ 682.974725] env[67964]: _type = "Task" [ 682.974725] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 682.987485] env[67964]: DEBUG oslo_vmware.api [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Task: {'id': task-3456724, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 683.418924] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 683.419209] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Creating directory with path [datastore1] vmware_temp/95a01ccf-1705-4a40-9ef5-f288a792b3e8/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 683.419437] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6d5b53b3-da94-4a33-9150-b5179052307f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 683.430985] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Created directory with path [datastore1] vmware_temp/95a01ccf-1705-4a40-9ef5-f288a792b3e8/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 683.431447] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Fetch image to [datastore1] vmware_temp/95a01ccf-1705-4a40-9ef5-f288a792b3e8/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 683.431447] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/95a01ccf-1705-4a40-9ef5-f288a792b3e8/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 683.432132] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83108592-e4d1-4516-8c3d-6ec73e3f8171 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 683.438806] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e593536-ee9c-4e28-a395-6d3239f5401c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 683.447792] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95c984e4-7360-441c-b65d-85cf748c7a19 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 683.481647] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62285895-256e-4391-8b24-914ba65d1eb8 {{(pid=67964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 683.488671] env[67964]: DEBUG oslo_vmware.api [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Task: {'id': task-3456724, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079469} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 683.491056] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 683.491056] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 683.491056] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 683.491056] env[67964]: INFO nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Took 0.60 seconds to destroy the instance on the hypervisor. 
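
[annotation, not part of the log] The failure sequence above is worth unpacking: vm_util.copy_virtual_disk() starts a CopyVirtualDisk_Task, and oslo.vmware's wait_for_task() (api.py:397) polls it (api.py:434) until vCenter reports success or error. On error, get_fault_class() tries to map the fault name to a registered exception class; "Fault InvalidArgument not matched" (exceptions.py:290) means no specific class exists, so the generic VimFaultException surfaces carrying the fault names. A minimal sketch of that calling pattern, with a hypothetical vCenter endpoint, credentials, and task reference:

    from oslo_vmware import api
    from oslo_vmware import exceptions as vexc

    # Hypothetical endpoint/credentials; the log's session actually targets
    # vc1.osci.c.eu-de-1.cloud.sap.
    session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                   api_retry_count=10, task_poll_interval=0.5)

    def wait_and_report(task_ref):
        try:
            # wait_for_task() drives the polling loop seen at api.py:434
            # until the task reaches 'success' or 'error'.
            return session.wait_for_task(task_ref)
        except vexc.VimFaultException as e:
            # With no registered class for the fault, the generic
            # VimFaultException carries the names, e.g. ['InvalidArgument'].
            print('task failed: %s (faults: %s)' % (e, e.fault_list))
            raise

Nova's session._wait_for_task() in the traceback is a thin wrapper around exactly this call (session.py:157).
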
[ 683.492428] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-45e55efe-53e2-4b95-9d82-971684b514ca {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 683.494338] env[67964]: DEBUG nova.compute.claims [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 683.494508] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 683.494714] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 683.529076] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 683.587489] env[67964]: DEBUG oslo_vmware.rw_handles [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/95a01ccf-1705-4a40-9ef5-f288a792b3e8/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 683.648855] env[67964]: DEBUG oslo_vmware.rw_handles [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 683.649069] env[67964]: DEBUG oslo_vmware.rw_handles [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/95a01ccf-1705-4a40-9ef5-f288a792b3e8/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 684.021883] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17788e9c-7bed-4c6e-be4d-e39dab61dcb9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 684.029126] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7b8d6b6-b13f-4bc7-b84d-dfbb40168ad8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 684.059336] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aecb7e43-756c-49f8-995b-d00e8a74de7b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 684.066040] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58385921-da52-479c-b621-165b08d9273f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 684.078481] env[67964]: DEBUG nova.compute.provider_tree [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 684.086941] env[67964]: DEBUG nova.scheduler.client.report [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 684.103712] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.609s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 684.104240] env[67964]: ERROR nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 684.104240] env[67964]: Faults: ['InvalidArgument'] [ 684.104240] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Traceback (most recent call last): [ 684.104240] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 684.104240] env[67964]: ERROR nova.compute.manager [instance: 
e55fdbbb-813d-427c-a53f-5be3fbeeb531] self.driver.spawn(context, instance, image_meta, [ 684.104240] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 684.104240] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] self._vmops.spawn(context, instance, image_meta, injected_files, [ 684.104240] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 684.104240] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] self._fetch_image_if_missing(context, vi) [ 684.104240] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 684.104240] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] image_cache(vi, tmp_image_ds_loc) [ 684.104240] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 684.104549] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] vm_util.copy_virtual_disk( [ 684.104549] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 684.104549] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] session._wait_for_task(vmdk_copy_task) [ 684.104549] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 684.104549] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] return self.wait_for_task(task_ref) [ 684.104549] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 684.104549] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] return evt.wait() [ 684.104549] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 684.104549] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] result = hub.switch() [ 684.104549] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 684.104549] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] return self.greenlet.switch() [ 684.104549] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 684.104549] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] self.f(*self.args, **self.kw) [ 684.104864] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 
448, in _poll_task [ 684.104864] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] raise exceptions.translate_fault(task_info.error) [ 684.104864] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 684.104864] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Faults: ['InvalidArgument'] [ 684.104864] env[67964]: ERROR nova.compute.manager [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] [ 684.104991] env[67964]: DEBUG nova.compute.utils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 684.106272] env[67964]: DEBUG nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Build of instance e55fdbbb-813d-427c-a53f-5be3fbeeb531 was re-scheduled: A specified parameter was not correct: fileType [ 684.106272] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 684.106643] env[67964]: DEBUG nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 684.106809] env[67964]: DEBUG nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 684.106959] env[67964]: DEBUG nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 684.107132] env[67964]: DEBUG nova.network.neutron [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 684.498526] env[67964]: DEBUG nova.network.neutron [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 684.509863] env[67964]: INFO nova.compute.manager [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] [instance: e55fdbbb-813d-427c-a53f-5be3fbeeb531] Took 0.40 seconds to deallocate network for instance. [ 684.620198] env[67964]: INFO nova.scheduler.client.report [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Deleted allocations for instance e55fdbbb-813d-427c-a53f-5be3fbeeb531 [ 684.650279] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a7362511-6f67-465d-a719-2d25d9d13d08 tempest-TenantUsagesTestJSON-698674808 tempest-TenantUsagesTestJSON-698674808-project-member] Lock "e55fdbbb-813d-427c-a53f-5be3fbeeb531" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 104.635s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 684.665691] env[67964]: DEBUG nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Starting instance... 
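The "released ... held 104.635s" line above comes from oslo.concurrency's lockutils, which Nova uses to serialize work per instance UUID and per shared resource such as "compute_resources". A rough sketch of both forms of the pattern; the lock body and function below are invented stand-ins:

    from oslo_concurrency import lockutils

    def do_build():
        pass  # hypothetical stand-in for the serialized build work

    # Context-manager form: concurrent callers block until the holder
    # releases, and wait/held durations are logged at DEBUG level.
    with lockutils.lock('e55fdbbb-813d-427c-a53f-5be3fbeeb531'):
        do_build()

    # Decorator form, as used for the resource tracker's claims.
    @lockutils.synchronized('compute_resources')
    def instance_claim(context, instance):
        pass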
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 684.724214] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 684.724214] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 684.725725] env[67964]: INFO nova.compute.claims [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 685.161978] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a7171d5-34b9-4318-b9a6-36efe3438095 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.171188] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b11cff10-5863-4fe2-bae4-62785e226fd8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.201285] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b9fa807-9ace-40ad-abed-e44abed7153f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.208722] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-792901a6-a12e-4020-b8a1-a5ac4339a11d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.221169] env[67964]: DEBUG nova.compute.provider_tree [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 685.229813] env[67964]: DEBUG nova.scheduler.client.report [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 685.246080] env[67964]: DEBUG oslo_concurrency.lockutils [None 
req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.522s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 685.246510] env[67964]: DEBUG nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 685.278665] env[67964]: DEBUG nova.compute.utils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 685.280287] env[67964]: DEBUG nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 685.280444] env[67964]: DEBUG nova.network.neutron [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 685.289078] env[67964]: DEBUG nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 685.350541] env[67964]: DEBUG nova.policy [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56fe6806253b4836b1ab6108bf78717f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '83a096cc635442ba971124ce656fd0eb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 685.355325] env[67964]: DEBUG nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Start spawning the instance on the hypervisor. 
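The inventory dictionary reported above is what placement schedules against: for each resource class the usable capacity is (total - reserved) * allocation_ratio, and no single allocation may exceed max_unit. Checking the logged numbers with plain arithmetic:

    inventory = {
        'VCPU': {'total': 48, 'reserved': 0,
                 'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512,
                      'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB': {'total': 400, 'reserved': 0,
                    'allocation_ratio': 1.0, 'max_unit': 95},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)
    # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0; the 4.0 VCPU
    # allocation_ratio is why 48 physical cores can back 192 vCPUs.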
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 685.380837] env[67964]: DEBUG nova.virt.hardware [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 685.381128] env[67964]: DEBUG nova.virt.hardware [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 685.381313] env[67964]: DEBUG nova.virt.hardware [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 685.381749] env[67964]: DEBUG nova.virt.hardware [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 685.381749] env[67964]: DEBUG nova.virt.hardware [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 685.381749] env[67964]: DEBUG nova.virt.hardware [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 685.381953] env[67964]: DEBUG nova.virt.hardware [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 685.382159] env[67964]: DEBUG nova.virt.hardware [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 685.382340] env[67964]: DEBUG nova.virt.hardware [None 
req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 685.382502] env[67964]: DEBUG nova.virt.hardware [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 685.382832] env[67964]: DEBUG nova.virt.hardware [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 685.383535] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b4aa5a1-b692-4631-bf7d-e38b8a3a75e8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.391385] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12c311f9-bc4f-4202-b50c-96e35e5be154 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.648887] env[67964]: DEBUG nova.network.neutron [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Successfully created port: f0633c1c-a372-4494-af4d-8b5bc585a367 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 686.510459] env[67964]: DEBUG nova.network.neutron [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Successfully updated port: f0633c1c-a372-4494-af4d-8b5bc585a367 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 686.529629] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Acquiring lock "refresh_cache-fed6991c-9b59-43bb-8cda-96053adb798b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 686.531260] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Acquired lock "refresh_cache-fed6991c-9b59-43bb-8cda-96053adb798b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 686.531506] env[67964]: DEBUG nova.network.neutron [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 686.596661] env[67964]: DEBUG nova.network.neutron [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 
tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 687.057203] env[67964]: DEBUG nova.network.neutron [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Updating instance_info_cache with network_info: [{"id": "f0633c1c-a372-4494-af4d-8b5bc585a367", "address": "fa:16:3e:d5:6e:2e", "network": {"id": "24043a77-7b6b-4e9f-8697-18fa4bca6409", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1000267223-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83a096cc635442ba971124ce656fd0eb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eb2eaecd-9701-4504-9fcb-fb1a420ead72", "external-id": "nsx-vlan-transportzone-433", "segmentation_id": 433, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf0633c1c-a3", "ovs_interfaceid": "f0633c1c-a372-4494-af4d-8b5bc585a367", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 687.065586] env[67964]: DEBUG nova.compute.manager [req-a1d199e0-9f12-4291-b3b4-7bfa590c3fa6 req-d20e9a96-d624-4f2f-ab52-609f9521daa7 service nova] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Received event network-vif-plugged-f0633c1c-a372-4494-af4d-8b5bc585a367 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 687.065820] env[67964]: DEBUG oslo_concurrency.lockutils [req-a1d199e0-9f12-4291-b3b4-7bfa590c3fa6 req-d20e9a96-d624-4f2f-ab52-609f9521daa7 service nova] Acquiring lock "fed6991c-9b59-43bb-8cda-96053adb798b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 687.066150] env[67964]: DEBUG oslo_concurrency.lockutils [req-a1d199e0-9f12-4291-b3b4-7bfa590c3fa6 req-d20e9a96-d624-4f2f-ab52-609f9521daa7 service nova] Lock "fed6991c-9b59-43bb-8cda-96053adb798b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 687.066221] env[67964]: DEBUG oslo_concurrency.lockutils [req-a1d199e0-9f12-4291-b3b4-7bfa590c3fa6 req-d20e9a96-d624-4f2f-ab52-609f9521daa7 service nova] Lock "fed6991c-9b59-43bb-8cda-96053adb798b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 687.066820] env[67964]: DEBUG nova.compute.manager [req-a1d199e0-9f12-4291-b3b4-7bfa590c3fa6 req-d20e9a96-d624-4f2f-ab52-609f9521daa7 service nova] [instance: 
fed6991c-9b59-43bb-8cda-96053adb798b] No waiting events found dispatching network-vif-plugged-f0633c1c-a372-4494-af4d-8b5bc585a367 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 687.066820] env[67964]: WARNING nova.compute.manager [req-a1d199e0-9f12-4291-b3b4-7bfa590c3fa6 req-d20e9a96-d624-4f2f-ab52-609f9521daa7 service nova] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Received unexpected event network-vif-plugged-f0633c1c-a372-4494-af4d-8b5bc585a367 for instance with vm_state building and task_state spawning. [ 687.066820] env[67964]: DEBUG nova.compute.manager [req-a1d199e0-9f12-4291-b3b4-7bfa590c3fa6 req-d20e9a96-d624-4f2f-ab52-609f9521daa7 service nova] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Received event network-changed-f0633c1c-a372-4494-af4d-8b5bc585a367 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 687.066820] env[67964]: DEBUG nova.compute.manager [req-a1d199e0-9f12-4291-b3b4-7bfa590c3fa6 req-d20e9a96-d624-4f2f-ab52-609f9521daa7 service nova] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Refreshing instance network info cache due to event network-changed-f0633c1c-a372-4494-af4d-8b5bc585a367. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 687.066997] env[67964]: DEBUG oslo_concurrency.lockutils [req-a1d199e0-9f12-4291-b3b4-7bfa590c3fa6 req-d20e9a96-d624-4f2f-ab52-609f9521daa7 service nova] Acquiring lock "refresh_cache-fed6991c-9b59-43bb-8cda-96053adb798b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 687.069366] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Releasing lock "refresh_cache-fed6991c-9b59-43bb-8cda-96053adb798b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 687.069634] env[67964]: DEBUG nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Instance network_info: |[{"id": "f0633c1c-a372-4494-af4d-8b5bc585a367", "address": "fa:16:3e:d5:6e:2e", "network": {"id": "24043a77-7b6b-4e9f-8697-18fa4bca6409", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1000267223-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83a096cc635442ba971124ce656fd0eb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eb2eaecd-9701-4504-9fcb-fb1a420ead72", "external-id": "nsx-vlan-transportzone-433", "segmentation_id": 433, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf0633c1c-a3", "ovs_interfaceid": "f0633c1c-a372-4494-af4d-8b5bc585a367", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 
687.069886] env[67964]: DEBUG oslo_concurrency.lockutils [req-a1d199e0-9f12-4291-b3b4-7bfa590c3fa6 req-d20e9a96-d624-4f2f-ab52-609f9521daa7 service nova] Acquired lock "refresh_cache-fed6991c-9b59-43bb-8cda-96053adb798b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 687.070533] env[67964]: DEBUG nova.network.neutron [req-a1d199e0-9f12-4291-b3b4-7bfa590c3fa6 req-d20e9a96-d624-4f2f-ab52-609f9521daa7 service nova] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Refreshing network info cache for port f0633c1c-a372-4494-af4d-8b5bc585a367 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 687.071344] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d5:6e:2e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'eb2eaecd-9701-4504-9fcb-fb1a420ead72', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f0633c1c-a372-4494-af4d-8b5bc585a367', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 687.078659] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Creating folder: Project (83a096cc635442ba971124ce656fd0eb). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 687.080191] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d45a2fd6-2064-4b95-b21e-b2a9b64a5473 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 687.093394] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Created folder: Project (83a096cc635442ba971124ce656fd0eb) in parent group-v690366. [ 687.094775] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Creating folder: Instances. Parent ref: group-v690404. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 687.094775] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-84835781-24af-4631-8a09-b5f10bacec53 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 687.101911] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Created folder: Instances in parent group-v690404. [ 687.102156] env[67964]: DEBUG oslo.service.loopingcall [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
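The "Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return" message is emitted from oslo_service/loopingcall.py (see the location tag just below), the fixed-interval polling helper behind task waits like the CreateVM_Task that follows. A self-contained sketch of that primitive; the task dict and completion check are placeholders:

    from oslo_service import loopingcall

    def _poll(task):
        # Invoked once per interval; raising LoopingCallDone stops the
        # loop and hands retvalue back to the wait() below.
        if task.get('done'):                   # placeholder completion check
            raise loopingcall.LoopingCallDone(retvalue=task)

    task = {'done': True}                      # stand-in for a task handle
    timer = loopingcall.FixedIntervalLoopingCall(_poll, task)
    result = timer.start(interval=0.5).wait()  # blocks until LoopingCallDone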
{{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 687.102397] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 687.102610] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-823361c1-33ef-4d4d-8031-6748bdd5066a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 687.124099] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 687.124099] env[67964]: value = "task-3456727" [ 687.124099] env[67964]: _type = "Task" [ 687.124099] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 687.133846] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456727, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 687.508946] env[67964]: DEBUG nova.network.neutron [req-a1d199e0-9f12-4291-b3b4-7bfa590c3fa6 req-d20e9a96-d624-4f2f-ab52-609f9521daa7 service nova] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Updated VIF entry in instance network info cache for port f0633c1c-a372-4494-af4d-8b5bc585a367. {{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 687.509328] env[67964]: DEBUG nova.network.neutron [req-a1d199e0-9f12-4291-b3b4-7bfa590c3fa6 req-d20e9a96-d624-4f2f-ab52-609f9521daa7 service nova] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Updating instance_info_cache with network_info: [{"id": "f0633c1c-a372-4494-af4d-8b5bc585a367", "address": "fa:16:3e:d5:6e:2e", "network": {"id": "24043a77-7b6b-4e9f-8697-18fa4bca6409", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1000267223-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83a096cc635442ba971124ce656fd0eb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eb2eaecd-9701-4504-9fcb-fb1a420ead72", "external-id": "nsx-vlan-transportzone-433", "segmentation_id": 433, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf0633c1c-a3", "ovs_interfaceid": "f0633c1c-a372-4494-af4d-8b5bc585a367", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 687.522700] env[67964]: DEBUG oslo_concurrency.lockutils [req-a1d199e0-9f12-4291-b3b4-7bfa590c3fa6 req-d20e9a96-d624-4f2f-ab52-609f9521daa7 service nova] Releasing lock "refresh_cache-fed6991c-9b59-43bb-8cda-96053adb798b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 687.634284] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456727, 'name': CreateVM_Task, 'duration_secs': 0.315426} completed successfully. 
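Each "Invoking Folder.CreateVM_Task" line above is a SOAP request issued through the oslo.vmware session, after which the returned task is polled (progress is 0%, then completed successfully with duration_secs recorded). A compressed sketch of that call path; the connection values and the folder_ref/config_spec/pool_ref managed-object references are placeholders, not taken from this log:

    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    # invoke_api() serializes the SOAP call; methods that return a
    # vSphere Task are then polled to completion by wait_for_task().
    # folder_ref, config_spec, pool_ref: placeholder managed-object refs.
    task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                              config=config_spec, pool=pool_ref)
    task_info = session.wait_for_task(task)  # raises on task error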
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 687.634464] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 687.635123] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 687.635291] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 687.635593] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 687.635842] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aed9df9b-f38b-48ec-ae11-3cd8cf64452f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 687.641163] env[67964]: DEBUG oslo_vmware.api [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Waiting for the task: (returnval){ [ 687.641163] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5256ba2d-e24c-acdf-7bf3-63c27eddd128" [ 687.641163] env[67964]: _type = "Task" [ 687.641163] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 687.649312] env[67964]: DEBUG oslo_vmware.api [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5256ba2d-e24c-acdf-7bf3-63c27eddd128, 'name': SearchDatastore_Task} progress is 0%. 
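Once pulled out of the log line, the instance_info_cache payload shown above is ordinary JSON, so the operationally interesting fields are easy to extract. The entry below is abbreviated to the fields used, with values copied from the log:

    vif = {
        "id": "f0633c1c-a372-4494-af4d-8b5bc585a367",
        "address": "fa:16:3e:d5:6e:2e",
        "network": {
            "subnets": [{"ips": [{"address": "192.168.128.13"}]}],
            "meta": {"mtu": 8950},
        },
    }

    port_id = vif["id"]
    mac = vif["address"]
    fixed_ip = vif["network"]["subnets"][0]["ips"][0]["address"]  # 192.168.128.13
    mtu = vif["network"]["meta"]["mtu"]                           # 8950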
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 688.153050] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 688.153050] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 688.153050] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 713.802628] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 713.802628] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 713.813024] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 713.813411] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 713.813732] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 713.814334] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 713.815129] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66964c27-e69f-4548-aea9-4eb72f109928 {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.824176] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09a86810-9378-4ac9-9eb9-516ceaf5a7de {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.839229] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fb47e04-858c-4ed4-beb9-0fb1943d926e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.845430] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d6ef3d7-f1ab-464d-abb4-ed14bea40887 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.874063] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180904MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 713.874275] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 713.874488] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 713.963739] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 93509103-8c02-420d-bcaa-c2cf0847b1f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 713.963901] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 371aeb17-ad59-4a01-88f7-466dfee8d293 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 713.964038] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 82096302-bbdd-49b4-bd19-bdf75343e03a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 713.964249] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance bd297ef0-fa45-43c1-ab4e-14bcce806b35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 713.964375] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 713.964489] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 180338df-2738-4eeb-8610-cb130d04f6d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 713.964601] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8b261c6e-741c-4d6c-9567-566af85cd68f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 713.964711] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9c586d33-c563-45c7-8c54-1638a78a669c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 713.964876] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 6580c348-f5a4-4f20-a6fb-8942202a526e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 713.965013] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fed6991c-9b59-43bb-8cda-96053adb798b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 713.975957] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 707828f6-0267-42ff-95e5-6b328382b017 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 713.985887] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0768fe80-7dd3-42ec-8e22-42a6aece5bef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 713.995761] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0ad6ab85-b1d6-479d-85a4-ff8ce5fb26e4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.005197] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 38541c24-fc6b-4385-91bf-de25df66a798 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.014062] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 068b288c-194b-4d2c-89c3-8adb7d628cc7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.022998] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 834fdc0b-5b2f-4374-a77a-de970c10e125 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.031580] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f50fc747-d3b4-456c-b86f-a086a7968329 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.041467] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f9e00f5a-036a-4141-b4ae-bda4c8a4c11b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.050396] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 188890b5-2189-4499-9856-22dc65b6c6f1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.059189] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d84f0e97-24d3-4b0b-8eff-51cf6bfd980c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.067922] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f963e0c6-8a3d-4872-8cae-07fff845b77f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.077590] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8028a7dd-4002-4db3-a738-3926c1d2340e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.086494] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 743b56db-49a0-4af7-96bc-a3fc6025fa19 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.095890] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 45e517a6-1ef1-4082-b5f9-24a9c932630c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.104945] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 5b7a605b-7521-40c3-92d1-ce5487f6fedd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.114310] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9e47d3ce-3897-458b-ac85-d98745e9aeb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.125662] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance b9a04994-804b-47b8-bc9f-cf4f18f27f5b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.136587] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 658d81c6-dd54-4af5-b51f-6b0ce8fb9336 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.147269] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9dc1cdba-3991-4ee2-b92b-2800e17f07a8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.156939] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9f51b0ff-a0fc-4a94-9d1e-578347c2f776 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.170019] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8b5afa56-a56a-4990-9ba7-2c0955579a65 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.181011] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance dcb0d0da-a987-46b2-be64-672ee3200eab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.190740] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2be271b8-775a-4c51-aa27-75a6a29e270b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
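The final resource view reported just below is the sum of the allocations listed above: ten instances are actively managed on the node (each holding 1 VCPU, 128 MB of RAM and 1 GB of disk), while the "scheduled ... yet to start" allocations are skipped. The reported used_ram also folds in the 512 MB reserved for the host:

    instances = 10                      # 'actively managed' entries above
    used_ram = 512 + instances * 128    # MB: host reserve + per-instance RAM
    used_disk = instances * 1           # GB
    used_vcpus = instances * 1
    print(used_ram, used_disk, used_vcpus)  # 1792 10 10, matching the log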
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 714.190740] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 714.190893] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 714.578657] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e2881ee-c184-43e6-888c-30bf2fd5961b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.585919] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-759ed11c-8768-4168-b0c8-5ea45e6aa2ef {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.615169] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5cc0007-6ef9-4ab9-a29f-0dcf52f6e20e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.622067] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23e1714e-cba5-45eb-a406-12ca15ef6732 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.634922] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 714.646615] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 714.660356] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 714.660451] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.786s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 715.659009] env[67964]: DEBUG oslo_service.periodic_task [None 
req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 715.659358] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 715.659500] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 715.659608] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 715.801048] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 716.800262] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 717.795582] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 717.800295] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 717.800558] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 717.800670] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 717.821028] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 717.821028] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 717.821028] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 717.821180] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 717.821319] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 717.821450] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 717.821571] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 717.821688] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 717.821805] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 717.821920] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 717.822048] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
[ 729.265535] env[67964]: WARNING oslo_vmware.rw_handles [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 729.265535] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 729.265535] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 729.265535] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 729.265535] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 729.265535] env[67964]: ERROR oslo_vmware.rw_handles response.begin()
[ 729.265535] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 729.265535] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 729.265535] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 729.265535] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 729.265535] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 729.265535] env[67964]: ERROR oslo_vmware.rw_handles
[ 729.266133] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/95a01ccf-1705-4a40-9ef5-f288a792b3e8/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 729.267465] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 729.267712] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Copying Virtual Disk [datastore1] vmware_temp/95a01ccf-1705-4a40-9ef5-f288a792b3e8/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/95a01ccf-1705-4a40-9ef5-f288a792b3e8/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 729.267996] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-57959116-e418-432f-838a-0278b36dc4d1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 729.276592] env[67964]: DEBUG oslo_vmware.api [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Waiting for the task: (returnval){
[ 729.276592] env[67964]: value = "task-3456728"
[ 729.276592] env[67964]: _type = "Task"
[ 729.276592] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 729.284345] env[67964]: DEBUG oslo_vmware.api [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Task: {'id': task-3456728, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 729.787942] env[67964]: DEBUG oslo_vmware.exceptions [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 729.791435] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 729.791435] env[67964]: ERROR nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 729.791435] env[67964]: Faults: ['InvalidArgument']
[ 729.791435] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Traceback (most recent call last):
[ 729.791435] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources
[ 729.791435] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] yield resources
[ 729.791435] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 729.791435] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] self.driver.spawn(context, instance, image_meta,
[ 729.791435] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 729.791435] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 729.791742] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 729.791742] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] self._fetch_image_if_missing(context, vi)
[ 729.791742] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 729.791742] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] image_cache(vi, tmp_image_ds_loc)
[ 729.791742] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 729.791742] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] vm_util.copy_virtual_disk(
[ 729.791742] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 729.791742] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] session._wait_for_task(vmdk_copy_task)
[ 729.791742] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 729.791742] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] return self.wait_for_task(task_ref)
[ 729.791742] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 729.791742] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] return evt.wait()
[ 729.791742] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 729.792153] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] result = hub.switch()
[ 729.792153] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 729.792153] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] return self.greenlet.switch()
[ 729.792153] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 729.792153] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] self.f(*self.args, **self.kw)
[ 729.792153] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 729.792153] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] raise exceptions.translate_fault(task_info.error)
[ 729.792153] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 729.792153] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Faults: ['InvalidArgument']
[ 729.792153] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35]
[ 729.792153] env[67964]: INFO nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Terminating instance
[ 729.792404] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 729.792404] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 729.792404] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4ee75677-7a35-4f5a-b4c3-f473321738c0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 729.795130] env[67964]: DEBUG nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 729.795464] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 729.796310] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7e6d486-8ee3-45c1-adbe-91771f72eb97 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 729.804053] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 729.804053] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-24f3ddfd-e7fb-4d1f-af76-1cc533277220 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 729.815029] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 729.815230] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 729.816222] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-59a3ed4d-1059-4ae1-97de-33517c3cc25e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 729.821471] env[67964]: DEBUG oslo_vmware.api [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Waiting for the task: (returnval){
[ 729.821471] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]528db653-e890-adca-9db8-2eec529adf03"
[ 729.821471] env[67964]: _type = "Task"
[ 729.821471] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 729.833195] env[67964]: DEBUG oslo_vmware.api [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]528db653-e890-adca-9db8-2eec529adf03, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 729.874019] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 729.874019] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 729.874019] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Deleting the datastore file [datastore1] bd297ef0-fa45-43c1-ab4e-14bcce806b35 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 729.874019] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e16853bc-b147-42d8-9823-05331207ba90 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 729.879669] env[67964]: DEBUG oslo_vmware.api [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Waiting for the task: (returnval){
[ 729.879669] env[67964]: value = "task-3456730"
[ 729.879669] env[67964]: _type = "Task"
[ 729.879669] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 729.888039] env[67964]: DEBUG oslo_vmware.api [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Task: {'id': task-3456730, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 730.332708] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 730.333012] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Creating directory with path [datastore1] vmware_temp/83a1a9fe-3108-453b-aac9-eb780bc393c7/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 730.333231] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-08d51420-a804-4ed9-bca2-ad3bfac2edd0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 730.345812] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Created directory with path [datastore1] vmware_temp/83a1a9fe-3108-453b-aac9-eb780bc393c7/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 730.346020] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Fetch image to [datastore1] vmware_temp/83a1a9fe-3108-453b-aac9-eb780bc393c7/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 730.346198] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/83a1a9fe-3108-453b-aac9-eb780bc393c7/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 730.346979] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53ce0592-f1a3-4ccc-9f39-26c0b8b3ec58 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 730.356030] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3646cdac-44e2-43e0-9cee-728c90642ee6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 730.364282] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3e695d9-a2de-44e4-a7f7-c463e788a9c7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 730.398339] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e794fee1-ef89-4f36-aef3-43e5b2866c60 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 730.405992] env[67964]: DEBUG oslo_vmware.api [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Task: {'id': task-3456730, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06812} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 730.407772] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 730.407929] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 730.408118] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 730.408305] env[67964]: INFO nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Took 0.61 seconds to destroy the instance on the hypervisor.
[ 730.410142] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7d1a9ff7-7c76-44e8-a63d-9c2415a87e62 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 730.412308] env[67964]: DEBUG nova.compute.claims [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 730.412479] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 730.412685] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 730.435462] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 730.496249] env[67964]: DEBUG oslo_vmware.rw_handles [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/83a1a9fe-3108-453b-aac9-eb780bc393c7/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 730.556940] env[67964]: DEBUG oslo_vmware.rw_handles [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 730.557150] env[67964]: DEBUG oslo_vmware.rw_handles [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/83a1a9fe-3108-453b-aac9-eb780bc393c7/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 730.869465] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-355b006c-9278-48d4-94cf-ef53e1197dee {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 730.877266] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13cdb267-01ea-4fd8-a421-b6977bc0e381 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 730.906526] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b24b878c-f084-41cb-bed4-69163b3aa7db {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 730.917240] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4eaf2cf-06f6-43db-be1e-de80f59f66c3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 730.930480] env[67964]: DEBUG nova.compute.provider_tree [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 730.939639] env[67964]: DEBUG nova.scheduler.client.report [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}}
[ 730.953211] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.540s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 730.953749] env[67964]: ERROR nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 730.953749] env[67964]: Faults: ['InvalidArgument']
[ 730.953749] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Traceback (most recent call last):
[ 730.953749] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 730.953749] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] self.driver.spawn(context, instance, image_meta,
[ 730.953749] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 730.953749] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 730.953749] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 730.953749] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] self._fetch_image_if_missing(context, vi)
[ 730.953749] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 730.953749] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] image_cache(vi, tmp_image_ds_loc)
[ 730.953749] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 730.954132] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] vm_util.copy_virtual_disk(
[ 730.954132] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 730.954132] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] session._wait_for_task(vmdk_copy_task)
[ 730.954132] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 730.954132] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] return self.wait_for_task(task_ref)
[ 730.954132] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 730.954132] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] return evt.wait()
[ 730.954132] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 730.954132] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] result = hub.switch()
[ 730.954132] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 730.954132] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] return self.greenlet.switch()
[ 730.954132] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 730.954132] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] self.f(*self.args, **self.kw)
[ 730.954425] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 730.954425] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] raise exceptions.translate_fault(task_info.error)
[ 730.954425] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 730.954425] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Faults: ['InvalidArgument']
[ 730.954425] env[67964]: ERROR nova.compute.manager [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35]
[ 730.954610] env[67964]: DEBUG nova.compute.utils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 730.956041] env[67964]: DEBUG nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Build of instance bd297ef0-fa45-43c1-ab4e-14bcce806b35 was re-scheduled: A specified parameter was not correct: fileType
[ 730.956041] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 730.956413] env[67964]: DEBUG nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 730.956583] env[67964]: DEBUG nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 730.956734] env[67964]: DEBUG nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 730.956889] env[67964]: DEBUG nova.network.neutron [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 731.318307] env[67964]: DEBUG nova.network.neutron [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 731.329705] env[67964]: INFO nova.compute.manager [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] [instance: bd297ef0-fa45-43c1-ab4e-14bcce806b35] Took 0.37 seconds to deallocate network for instance.
[ 731.434434] env[67964]: INFO nova.scheduler.client.report [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Deleted allocations for instance bd297ef0-fa45-43c1-ab4e-14bcce806b35
[ 731.463034] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9f05f505-e31a-451a-bc9c-0f2bca225c84 tempest-ServerDiagnosticsTest-1547794489 tempest-ServerDiagnosticsTest-1547794489-project-member] Lock "bd297ef0-fa45-43c1-ab4e-14bcce806b35" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 147.561s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 731.486625] env[67964]: DEBUG nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 731.544426] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 731.544682] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 731.546654] env[67964]: INFO nova.compute.claims [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 731.990092] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cb75d7e-2a7f-4074-b98f-950372ba5f54 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 731.998242] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aab17e56-d7e3-41fc-9e37-bb8e34de5925 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 732.028452] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c101f754-fd3d-4c30-97d9-d4643c447727 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 732.036161] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73346a56-abf4-47bf-b82c-7378b6ebdc78 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 732.050528] env[67964]: DEBUG nova.compute.provider_tree [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 732.058711] env[67964]: DEBUG nova.scheduler.client.report [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}}
[ 732.077553] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.533s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 732.078080] env[67964]: DEBUG nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 732.114340] env[67964]: DEBUG nova.compute.utils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 732.115832] env[67964]: DEBUG nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}}
[ 732.116535] env[67964]: DEBUG nova.network.neutron [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 732.124369] env[67964]: DEBUG nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 732.199181] env[67964]: DEBUG nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Start spawning the instance on the hypervisor. {{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
[ 732.200526] env[67964]: DEBUG nova.policy [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ee452e4d10aa49a197d087281de0ffe0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8c726e030e02486b8e157a72b314a956', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}}
[ 732.222221] env[67964]: DEBUG nova.virt.hardware [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 732.222460] env[67964]: DEBUG nova.virt.hardware [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 732.222613] env[67964]: DEBUG nova.virt.hardware [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 732.222784] env[67964]: DEBUG nova.virt.hardware [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 732.222921] env[67964]: DEBUG nova.virt.hardware [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 732.223089] env[67964]: DEBUG nova.virt.hardware [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 732.223380] env[67964]: DEBUG nova.virt.hardware [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
nova.virt.hardware [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 732.223564] env[67964]: DEBUG nova.virt.hardware [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 732.225027] env[67964]: DEBUG nova.virt.hardware [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 732.225027] env[67964]: DEBUG nova.virt.hardware [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 732.225027] env[67964]: DEBUG nova.virt.hardware [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 732.225027] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35a5878a-6758-4fbd-8471-3248395ee04d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.233629] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d34e9233-bf08-4af9-8efc-ff208431a667 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.822210] env[67964]: DEBUG nova.network.neutron [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Successfully created port: 77d745ad-8be9-47e7-8e56-b3315e9e7d13 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 733.615291] env[67964]: DEBUG nova.compute.manager [req-46d626f9-39ef-40b0-aac4-98e5733397d3 req-50719699-b67e-40e4-a685-75d005ead55c service nova] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Received event network-vif-plugged-77d745ad-8be9-47e7-8e56-b3315e9e7d13 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 733.615291] env[67964]: DEBUG oslo_concurrency.lockutils [req-46d626f9-39ef-40b0-aac4-98e5733397d3 req-50719699-b67e-40e4-a685-75d005ead55c service nova] Acquiring lock "707828f6-0267-42ff-95e5-6b328382b017-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 733.615291] env[67964]: DEBUG oslo_concurrency.lockutils [req-46d626f9-39ef-40b0-aac4-98e5733397d3 
req-50719699-b67e-40e4-a685-75d005ead55c service nova] Lock "707828f6-0267-42ff-95e5-6b328382b017-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 733.615291] env[67964]: DEBUG oslo_concurrency.lockutils [req-46d626f9-39ef-40b0-aac4-98e5733397d3 req-50719699-b67e-40e4-a685-75d005ead55c service nova] Lock "707828f6-0267-42ff-95e5-6b328382b017-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 733.615400] env[67964]: DEBUG nova.compute.manager [req-46d626f9-39ef-40b0-aac4-98e5733397d3 req-50719699-b67e-40e4-a685-75d005ead55c service nova] [instance: 707828f6-0267-42ff-95e5-6b328382b017] No waiting events found dispatching network-vif-plugged-77d745ad-8be9-47e7-8e56-b3315e9e7d13 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 733.615612] env[67964]: WARNING nova.compute.manager [req-46d626f9-39ef-40b0-aac4-98e5733397d3 req-50719699-b67e-40e4-a685-75d005ead55c service nova] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Received unexpected event network-vif-plugged-77d745ad-8be9-47e7-8e56-b3315e9e7d13 for instance with vm_state building and task_state spawning. [ 733.627366] env[67964]: DEBUG nova.network.neutron [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Successfully updated port: 77d745ad-8be9-47e7-8e56-b3315e9e7d13 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 733.647033] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Acquiring lock "refresh_cache-707828f6-0267-42ff-95e5-6b328382b017" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 733.647033] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Acquired lock "refresh_cache-707828f6-0267-42ff-95e5-6b328382b017" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 733.647033] env[67964]: DEBUG nova.network.neutron [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 733.695050] env[67964]: DEBUG nova.network.neutron [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Instance cache missing network info. 
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 733.937114] env[67964]: DEBUG nova.network.neutron [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Updating instance_info_cache with network_info: [{"id": "77d745ad-8be9-47e7-8e56-b3315e9e7d13", "address": "fa:16:3e:b0:c3:13", "network": {"id": "0d29cca7-4e3c-429a-8bcf-a4bcc8fc5767", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-805855155-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8c726e030e02486b8e157a72b314a956", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "099fe970-c61f-4480-bed4-ae4f485fd82a", "external-id": "nsx-vlan-transportzone-678", "segmentation_id": 678, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap77d745ad-8b", "ovs_interfaceid": "77d745ad-8be9-47e7-8e56-b3315e9e7d13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 733.961269] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Releasing lock "refresh_cache-707828f6-0267-42ff-95e5-6b328382b017" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 733.962521] env[67964]: DEBUG nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Instance network_info: |[{"id": "77d745ad-8be9-47e7-8e56-b3315e9e7d13", "address": "fa:16:3e:b0:c3:13", "network": {"id": "0d29cca7-4e3c-429a-8bcf-a4bcc8fc5767", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-805855155-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8c726e030e02486b8e157a72b314a956", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "099fe970-c61f-4480-bed4-ae4f485fd82a", "external-id": "nsx-vlan-transportzone-678", "segmentation_id": 678, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap77d745ad-8b", "ovs_interfaceid": "77d745ad-8be9-47e7-8e56-b3315e9e7d13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 733.962836] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b0:c3:13', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '099fe970-c61f-4480-bed4-ae4f485fd82a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '77d745ad-8be9-47e7-8e56-b3315e9e7d13', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 733.970075] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Creating folder: Project (8c726e030e02486b8e157a72b314a956). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 733.970649] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-de2414ec-c520-454e-a32b-029074e49cc9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 733.980908] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Created folder: Project (8c726e030e02486b8e157a72b314a956) in parent group-v690366. [ 733.981094] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Creating folder: Instances. Parent ref: group-v690407. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 733.981340] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6fa4fe74-73a1-49e9-8bf3-1468bd475490 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 733.989994] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Created folder: Instances in parent group-v690407. [ 733.990315] env[67964]: DEBUG oslo.service.loopingcall [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
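The "Instance VIF info" record above shows the translation performed before the VM is built: each Neutron VIF in network_info becomes a flat dict with the bridge as network_name, the port's MAC address, an OpaqueNetwork reference keyed by the NSX logical-switch id, and the vmxnet3 NIC model. A sketch of that mapping, assuming the network_info structure printed earlier in the log; the helper itself is hypothetical.

def vif_info_from_network_info(network_info):
    # Translate Neutron network_info entries into the flat VIF dicts the
    # vmwareapi driver logs above; field names follow the log records.
    out = []
    for vif in network_info:
        details = vif.get("details", {})
        out.append({
            "network_name": vif["network"]["bridge"],           # e.g. br-int
            "mac_address": vif["address"],
            "network_ref": {
                "type": "OpaqueNetwork",
                "network-id": details["nsx-logical-switch-id"],  # 099fe970-... above
                "network-type": "nsx.LogicalSwitch",
                "use-external-id": True,
            },
            "iface_id": vif["id"],
            "vif_model": "vmxnet3",                              # NIC model in the log
        })
    return out

Feeding it the network_info list logged above would reproduce the VIF info record verbatim.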
{{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 733.990567] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 733.990842] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e4f81f3d-b930-4f91-9319-171680f28606 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 734.011999] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 734.011999] env[67964]: value = "task-3456733" [ 734.011999] env[67964]: _type = "Task" [ 734.011999] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 734.020065] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456733, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 734.524476] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456733, 'name': CreateVM_Task, 'duration_secs': 0.28024} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 734.524608] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 734.525265] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 734.525425] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 734.525735] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 734.525976] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eb8cda3d-250a-4de0-9acf-322bd2eee8b9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 734.530279] env[67964]: DEBUG oslo_vmware.api [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Waiting for the task: (returnval){ [ 734.530279] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]525937cc-814b-e107-d946-9f7c329c2096" [ 734.530279] env[67964]: _type = "Task" [ 734.530279] env[67964]: } to complete. 
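Both CreateVM_Task and the SearchDatastore_Task being awaited here follow the same pattern: the API returns a Task handle immediately and the client polls it at a fixed interval until it reports success or error (task-3456733 went from 0% to completed in roughly 0.28s). A stdlib sketch of such a poll loop; the (state, info) protocol is an assumption for illustration, not the vSphere API.

import time

class TaskFailed(Exception):
    pass

def wait_for_task(poll, interval=0.5, timeout=300.0):
    # Poll `poll()` -> (state, info) until the task leaves its running state.
    # Sketch of the wait_for_task/_poll_task pattern visible in the log.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state, info = poll()
        if state == "success":
            return info
        if state == "error":
            raise TaskFailed(info)   # translated into a typed fault upstream
        time.sleep(interval)         # matches the fixed-interval looping call
    raise TimeoutError("task did not complete")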
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 734.537511] env[67964]: DEBUG oslo_vmware.api [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]525937cc-814b-e107-d946-9f7c329c2096, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 735.039964] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 735.040240] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 735.040466] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 735.767892] env[67964]: DEBUG nova.compute.manager [req-ffc51f21-c7d3-47f4-9b09-c1657fb5e5bf req-d3e7e23d-603d-4103-a17c-666c92bd946a service nova] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Received event network-changed-77d745ad-8be9-47e7-8e56-b3315e9e7d13 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 735.767892] env[67964]: DEBUG nova.compute.manager [req-ffc51f21-c7d3-47f4-9b09-c1657fb5e5bf req-d3e7e23d-603d-4103-a17c-666c92bd946a service nova] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Refreshing instance network info cache due to event network-changed-77d745ad-8be9-47e7-8e56-b3315e9e7d13. 
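A network-changed event does not patch the cache in place; the records that follow re-acquire the same per-instance "refresh_cache-<uuid>" lock used during allocation and rebuild the entry from Neutron, so concurrent writers cannot interleave partial updates. A minimal sketch of that refresh-under-lock pattern, with a plain threading.Lock standing in for oslo.concurrency.

import threading
from collections import defaultdict

# name -> lock; a real implementation also guards this registry itself.
_refresh_locks = defaultdict(threading.Lock)

def refresh_network_cache(cache, uuid, fetch_nw_info):
    # Refresh one instance's cached network_info under its per-instance
    # lock, mirroring the Acquire/Release pairs in the log (hypothetical).
    with _refresh_locks[f"refresh_cache-{uuid}"]:
        cache[uuid] = fetch_nw_info(uuid)   # e.g. re-query Neutron for the port
    return cache[uuid]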
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 735.768088] env[67964]: DEBUG oslo_concurrency.lockutils [req-ffc51f21-c7d3-47f4-9b09-c1657fb5e5bf req-d3e7e23d-603d-4103-a17c-666c92bd946a service nova] Acquiring lock "refresh_cache-707828f6-0267-42ff-95e5-6b328382b017" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 735.768412] env[67964]: DEBUG oslo_concurrency.lockutils [req-ffc51f21-c7d3-47f4-9b09-c1657fb5e5bf req-d3e7e23d-603d-4103-a17c-666c92bd946a service nova] Acquired lock "refresh_cache-707828f6-0267-42ff-95e5-6b328382b017" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 735.768620] env[67964]: DEBUG nova.network.neutron [req-ffc51f21-c7d3-47f4-9b09-c1657fb5e5bf req-d3e7e23d-603d-4103-a17c-666c92bd946a service nova] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Refreshing network info cache for port 77d745ad-8be9-47e7-8e56-b3315e9e7d13 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 736.107226] env[67964]: DEBUG nova.network.neutron [req-ffc51f21-c7d3-47f4-9b09-c1657fb5e5bf req-d3e7e23d-603d-4103-a17c-666c92bd946a service nova] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Updated VIF entry in instance network info cache for port 77d745ad-8be9-47e7-8e56-b3315e9e7d13. {{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 736.107573] env[67964]: DEBUG nova.network.neutron [req-ffc51f21-c7d3-47f4-9b09-c1657fb5e5bf req-d3e7e23d-603d-4103-a17c-666c92bd946a service nova] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Updating instance_info_cache with network_info: [{"id": "77d745ad-8be9-47e7-8e56-b3315e9e7d13", "address": "fa:16:3e:b0:c3:13", "network": {"id": "0d29cca7-4e3c-429a-8bcf-a4bcc8fc5767", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-805855155-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8c726e030e02486b8e157a72b314a956", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "099fe970-c61f-4480-bed4-ae4f485fd82a", "external-id": "nsx-vlan-transportzone-678", "segmentation_id": 678, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap77d745ad-8b", "ovs_interfaceid": "77d745ad-8be9-47e7-8e56-b3315e9e7d13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.117324] env[67964]: DEBUG oslo_concurrency.lockutils [req-ffc51f21-c7d3-47f4-9b09-c1657fb5e5bf req-d3e7e23d-603d-4103-a17c-666c92bd946a service nova] Releasing lock "refresh_cache-707828f6-0267-42ff-95e5-6b328382b017" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 742.261320] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Acquiring lock 
"ea492fb8-2352-436c-a7d5-f20423f4d353" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 742.262011] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Lock "ea492fb8-2352-436c-a7d5-f20423f4d353" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 774.800260] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 774.800613] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 774.800712] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 774.800859] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 775.800923] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 775.814523] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 775.814807] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 775.815144] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 775.815230] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 775.816374] env[67964]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0457343a-39d1-4193-82d7-622e18ab5bae {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 775.825497] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6918e50f-4452-4d43-9749-4aa861c462c7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 775.840030] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-162e2b34-a154-4164-96f4-8b122f88f2b6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 775.846844] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93a104cd-afc3-4848-a97a-8c0986384e91 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 775.876760] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180886MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 775.876926] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 775.877217] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 775.951035] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 93509103-8c02-420d-bcaa-c2cf0847b1f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 775.951171] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 371aeb17-ad59-4a01-88f7-466dfee8d293 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 775.951295] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 82096302-bbdd-49b4-bd19-bdf75343e03a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 775.951420] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 775.951565] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 180338df-2738-4eeb-8610-cb130d04f6d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 775.951686] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8b261c6e-741c-4d6c-9567-566af85cd68f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 775.951800] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9c586d33-c563-45c7-8c54-1638a78a669c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 775.951913] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 6580c348-f5a4-4f20-a6fb-8942202a526e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 775.952037] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fed6991c-9b59-43bb-8cda-96053adb798b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 775.952162] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 707828f6-0267-42ff-95e5-6b328382b017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 775.963633] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0768fe80-7dd3-42ec-8e22-42a6aece5bef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 775.976678] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0ad6ab85-b1d6-479d-85a4-ff8ce5fb26e4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 775.987345] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 38541c24-fc6b-4385-91bf-de25df66a798 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 775.997227] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 068b288c-194b-4d2c-89c3-8adb7d628cc7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.007038] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 834fdc0b-5b2f-4374-a77a-de970c10e125 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.018725] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f50fc747-d3b4-456c-b86f-a086a7968329 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.029373] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f9e00f5a-036a-4141-b4ae-bda4c8a4c11b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.038879] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 188890b5-2189-4499-9856-22dc65b6c6f1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.049441] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d84f0e97-24d3-4b0b-8eff-51cf6bfd980c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.059464] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f963e0c6-8a3d-4872-8cae-07fff845b77f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.070508] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8028a7dd-4002-4db3-a738-3926c1d2340e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.081065] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 743b56db-49a0-4af7-96bc-a3fc6025fa19 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.095519] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 45e517a6-1ef1-4082-b5f9-24a9c932630c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.110295] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 5b7a605b-7521-40c3-92d1-ce5487f6fedd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.120582] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9e47d3ce-3897-458b-ac85-d98745e9aeb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.130500] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance b9a04994-804b-47b8-bc9f-cf4f18f27f5b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.144022] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 658d81c6-dd54-4af5-b51f-6b0ce8fb9336 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.153976] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9dc1cdba-3991-4ee2-b92b-2800e17f07a8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.164025] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9f51b0ff-a0fc-4a94-9d1e-578347c2f776 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.177631] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8b5afa56-a56a-4990-9ba7-2c0955579a65 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.187018] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance dcb0d0da-a987-46b2-be64-672ee3200eab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.197273] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2be271b8-775a-4c51-aa27-75a6a29e270b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.206860] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea492fb8-2352-436c-a7d5-f20423f4d353 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 776.207121] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 776.207315] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 776.572721] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f9b80d0-d14e-43ea-8617-89e877c39ee1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.580156] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1523b3f3-501f-4c50-9ffa-a0988e894c11 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.610069] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21cbd117-b344-401f-8337-38b9abd9ffd6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.616202] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41c8513f-1267-4de2-8a84-2349725bf40b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.629920] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 776.638244] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 776.656152] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 776.656351] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.779s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 777.656561] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 777.795597] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 777.800337] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 777.800575] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 777.800707] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 777.820563] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 777.820745] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 777.820858] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 777.820980] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 777.821115] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 777.821236] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 777.821504] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 777.821576] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 777.821675] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 777.821789] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 777.821908] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
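Stepping back to the update_available_resource audit above: the "Final resource view" is just the placement allocations of the actively managed instances folded into totals on top of the reserved memory from the provider inventory, i.e. 512MB reserved plus ten 128MB allocations gives the logged used_ram=1792MB, ten 1GB allocations give used_disk=10GB, and ten VCPUs give used_vcpus=10. A hypothetical helper that reproduces that arithmetic:

def used_resource_view(allocations, reserved_ram_mb=512):
    # Fold the per-instance placement allocations (the "actively managed"
    # records above) into the totals of the "Final resource view" record.
    # Hypothetical helper, not the ResourceTracker API.
    used = {"MEMORY_MB": reserved_ram_mb, "DISK_GB": 0, "VCPU": 0}
    for alloc in allocations:
        for rc, amount in alloc["resources"].items():
            used[rc] = used.get(rc, 0) + amount
    return used

# The ten identical allocations logged above reproduce the logged totals:
allocs = [{"resources": {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}}] * 10
assert used_resource_view(allocs) == {"MEMORY_MB": 1792, "DISK_GB": 10, "VCPU": 10}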
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 777.822390] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 777.822577] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 778.818088] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 779.585580] env[67964]: WARNING oslo_vmware.rw_handles [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 779.585580] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 779.585580] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 779.585580] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 779.585580] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 779.585580] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 779.585580] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 779.585580] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 779.585580] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 779.585580] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 779.585580] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 779.585580] env[67964]: ERROR oslo_vmware.rw_handles [ 779.586084] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/83a1a9fe-3108-453b-aac9-eb780bc393c7/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 779.587606] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 779.587854] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 
tempest-DeleteServersAdminTestJSON-788363875-project-member] Copying Virtual Disk [datastore1] vmware_temp/83a1a9fe-3108-453b-aac9-eb780bc393c7/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/83a1a9fe-3108-453b-aac9-eb780bc393c7/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 779.588159] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-bb70ebc2-1bae-4f93-bb0a-b236c3d8f68e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 779.596084] env[67964]: DEBUG oslo_vmware.api [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Waiting for the task: (returnval){ [ 779.596084] env[67964]: value = "task-3456734" [ 779.596084] env[67964]: _type = "Task" [ 779.596084] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 779.603683] env[67964]: DEBUG oslo_vmware.api [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Task: {'id': task-3456734, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 780.107015] env[67964]: DEBUG oslo_vmware.exceptions [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Fault InvalidArgument not matched. 
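The CopyVirtualDisk_Task has failed with an InvalidArgument fault ("A specified parameter was not correct: fileType"). The "Fault InvalidArgument not matched" line is oslo.vmware looking up a dedicated exception class for the fault name and falling back to the generic VimFaultException when none is registered, which is the exception the spawn traceback below carries. A sketch of that registry-with-fallback shape; the registry contents here are illustrative.

class VimFaultException(Exception):
    # Generic fallback carrying the raw fault names, like the
    # "Faults: ['InvalidArgument']" seen in the traceback below.
    def __init__(self, fault_list, msg):
        super().__init__(msg)
        self.fault_list = fault_list

class FileFault(VimFaultException):
    pass

# Only faults with a dedicated class get one; everything else falls back.
_FAULT_CLASSES = {"FileFault": FileFault}   # illustrative registry contents

def translate_fault(fault_name, msg):
    # Mirrors the get_fault_class lookup-then-fallback seen in the log (sketch).
    cls = _FAULT_CLASSES.get(fault_name)
    if cls is None:
        # "Fault InvalidArgument not matched" -> generic exception
        return VimFaultException([fault_name], msg)
    return cls([fault_name], msg)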
{{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 780.107335] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 780.107898] env[67964]: ERROR nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 780.107898] env[67964]: Faults: ['InvalidArgument'] [ 780.107898] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Traceback (most recent call last): [ 780.107898] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 780.107898] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] yield resources [ 780.107898] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 780.107898] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] self.driver.spawn(context, instance, image_meta, [ 780.107898] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 780.107898] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 780.107898] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 780.107898] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] self._fetch_image_if_missing(context, vi) [ 780.107898] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 780.108218] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] image_cache(vi, tmp_image_ds_loc) [ 780.108218] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 780.108218] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] vm_util.copy_virtual_disk( [ 780.108218] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 780.108218] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] session._wait_for_task(vmdk_copy_task) [ 780.108218] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 780.108218] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] return self.wait_for_task(task_ref) [ 780.108218] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 780.108218] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] return evt.wait() [ 780.108218] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 780.108218] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] result = hub.switch() [ 780.108218] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 780.108218] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] return self.greenlet.switch() [ 780.108640] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 780.108640] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] self.f(*self.args, **self.kw) [ 780.108640] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 780.108640] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] raise exceptions.translate_fault(task_info.error) [ 780.108640] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 780.108640] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Faults: ['InvalidArgument'] [ 780.108640] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] [ 780.108640] env[67964]: INFO nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Terminating instance [ 780.109754] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 780.109897] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 780.110155] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3e30218d-92b6-4644-b8f8-2a00251cc23d {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.113130] env[67964]: DEBUG nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 780.113330] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 780.114068] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d956e4d-d1bf-46af-9564-97f0e4074d8e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.120877] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 780.121098] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-23666552-b70c-47fa-8c0c-d9badc96680d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.123286] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 780.123455] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 780.124409] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5a0985e8-589c-44ad-b024-6bfe181c4fa9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.129224] env[67964]: DEBUG oslo_vmware.api [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Waiting for the task: (returnval){ [ 780.129224] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5222c82d-3a35-9953-8797-cb0c4bfb5e9b" [ 780.129224] env[67964]: _type = "Task" [ 780.129224] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 780.140667] env[67964]: DEBUG oslo_vmware.api [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5222c82d-3a35-9953-8797-cb0c4bfb5e9b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 780.183311] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 780.183531] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 780.183710] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Deleting the datastore file [datastore1] 82096302-bbdd-49b4-bd19-bdf75343e03a {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 780.183980] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-76fcd849-061c-467d-94e1-90c9087b459f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.190948] env[67964]: DEBUG oslo_vmware.api [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Waiting for the task: (returnval){ [ 780.190948] env[67964]: value = "task-3456736" [ 780.190948] env[67964]: _type = "Task" [ 780.190948] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 780.198630] env[67964]: DEBUG oslo_vmware.api [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Task: {'id': task-3456736, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 780.640198] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 780.640420] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Creating directory with path [datastore1] vmware_temp/b66c1352-2924-4bb8-a0c9-27bc6a4e7227/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 780.640699] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c27ebf55-58f6-4c61-a477-e50db9c5dbab {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.652611] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Created directory with path [datastore1] vmware_temp/b66c1352-2924-4bb8-a0c9-27bc6a4e7227/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 780.652805] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Fetch image to [datastore1] vmware_temp/b66c1352-2924-4bb8-a0c9-27bc6a4e7227/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 780.652964] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/b66c1352-2924-4bb8-a0c9-27bc6a4e7227/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 780.653680] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21bbfff4-56cc-4e74-b5b8-bf6ae4a1da62 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.659993] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c3ebb35-5f5a-4f5a-a508-8db262cdc072 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.668908] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4555396d-fbbf-4a11-ba99-0fe85456853e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.701731] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-877ad019-0083-4bd8-9621-92cf9e4d47b9 {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.708355] env[67964]: DEBUG oslo_vmware.api [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Task: {'id': task-3456736, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074759} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 780.709794] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 780.709982] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 780.710163] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 780.710334] env[67964]: INFO nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 780.712164] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-89e3d6ab-99fc-4810-8809-9fd6be122c42 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.713998] env[67964]: DEBUG nova.compute.claims [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 780.714181] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 780.714389] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 780.737432] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 780.789951] env[67964]: DEBUG oslo_vmware.rw_handles [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b66c1352-2924-4bb8-a0c9-27bc6a4e7227/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 780.850237] env[67964]: DEBUG oslo_vmware.rw_handles [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 780.850431] env[67964]: DEBUG oslo_vmware.rw_handles [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b66c1352-2924-4bb8-a0c9-27bc6a4e7227/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 781.165445] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0dfbf6ad-8771-428d-8c2b-d447430a397e tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Acquiring lock "371aeb17-ad59-4a01-88f7-466dfee8d293" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 781.184926] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf9edbf5-c9b7-43a0-a790-27547a206d81 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.192744] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-988d7ea8-7cfd-449b-93b5-c424c95bc743 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.222953] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07ed20e5-4b1f-4e68-924b-115f633c01f3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.230598] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e552138-532c-4644-bbf9-ff8bb4792dc6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.244023] env[67964]: DEBUG nova.compute.provider_tree [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 781.252600] env[67964]: DEBUG nova.scheduler.client.report [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 781.269485] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.555s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 781.270059] env[67964]: ERROR nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Failed to build and run instance: 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 781.270059] env[67964]: Faults: ['InvalidArgument'] [ 781.270059] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Traceback (most recent call last): [ 781.270059] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 781.270059] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] self.driver.spawn(context, instance, image_meta, [ 781.270059] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 781.270059] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 781.270059] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 781.270059] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] self._fetch_image_if_missing(context, vi) [ 781.270059] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 781.270059] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] image_cache(vi, tmp_image_ds_loc) [ 781.270059] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 781.270366] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] vm_util.copy_virtual_disk( [ 781.270366] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 781.270366] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] session._wait_for_task(vmdk_copy_task) [ 781.270366] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 781.270366] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] return self.wait_for_task(task_ref) [ 781.270366] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 781.270366] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] return evt.wait() [ 781.270366] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 781.270366] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] result = hub.switch() [ 781.270366] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 781.270366] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] return 
self.greenlet.switch() [ 781.270366] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 781.270366] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] self.f(*self.args, **self.kw) [ 781.270701] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 781.270701] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] raise exceptions.translate_fault(task_info.error) [ 781.270701] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 781.270701] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Faults: ['InvalidArgument'] [ 781.270701] env[67964]: ERROR nova.compute.manager [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] [ 781.270823] env[67964]: DEBUG nova.compute.utils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 781.272199] env[67964]: DEBUG nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Build of instance 82096302-bbdd-49b4-bd19-bdf75343e03a was re-scheduled: A specified parameter was not correct: fileType [ 781.272199] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 781.272580] env[67964]: DEBUG nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 781.272792] env[67964]: DEBUG nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 781.272902] env[67964]: DEBUG nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 781.273075] env[67964]: DEBUG nova.network.neutron [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 781.402753] env[67964]: DEBUG oslo_concurrency.lockutils [None req-38346a1b-6b35-47f0-a34b-6a70f4e1dc54 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Acquiring lock "93509103-8c02-420d-bcaa-c2cf0847b1f0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 781.600404] env[67964]: DEBUG nova.network.neutron [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 781.613307] env[67964]: INFO nova.compute.manager [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: 82096302-bbdd-49b4-bd19-bdf75343e03a] Took 0.34 seconds to deallocate network for instance. [ 781.721105] env[67964]: INFO nova.scheduler.client.report [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Deleted allocations for instance 82096302-bbdd-49b4-bd19-bdf75343e03a [ 781.741899] env[67964]: DEBUG oslo_concurrency.lockutils [None req-33a3a45f-5cdd-4496-8e49-c8e22c8e9e73 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "82096302-bbdd-49b4-bd19-bdf75343e03a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.927s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 781.761949] env[67964]: DEBUG nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Starting instance... 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 781.823395] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 781.823652] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 781.825077] env[67964]: INFO nova.compute.claims [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 782.231824] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9102d4b4-eb50-490c-ac62-f4d2c9ee03ed {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.239631] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d28259c5-e4b0-45ac-a4d8-202fbd0003b0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.272026] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84327d86-90c2-4241-8d7a-8be7316dbd87 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.277314] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8af3302c-c096-432b-87c7-26e8c6ace8cb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.289962] env[67964]: DEBUG nova.compute.provider_tree [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 782.298656] env[67964]: DEBUG nova.scheduler.client.report [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 782.311252] env[67964]: DEBUG 
oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.488s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 782.311755] env[67964]: DEBUG nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 782.343619] env[67964]: DEBUG nova.compute.utils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 782.345361] env[67964]: DEBUG nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 782.345361] env[67964]: DEBUG nova.network.neutron [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 782.353875] env[67964]: DEBUG nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 782.417262] env[67964]: DEBUG nova.policy [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eff4126d474d4a1ba245e687830ab8e7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ddfb5b3af37495f80bb33263d56940b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 782.420647] env[67964]: DEBUG nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 782.446811] env[67964]: DEBUG nova.virt.hardware [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 782.447076] env[67964]: DEBUG nova.virt.hardware [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 782.447245] env[67964]: DEBUG nova.virt.hardware [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 782.447456] env[67964]: DEBUG nova.virt.hardware [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 782.447607] env[67964]: DEBUG nova.virt.hardware [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 782.447753] env[67964]: DEBUG nova.virt.hardware [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 782.447962] env[67964]: DEBUG nova.virt.hardware [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 782.448133] env[67964]: DEBUG nova.virt.hardware [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
782.448300] env[67964]: DEBUG nova.virt.hardware [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 782.448464] env[67964]: DEBUG nova.virt.hardware [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 782.448636] env[67964]: DEBUG nova.virt.hardware [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 782.449498] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b6eb2b4-315d-4446-83b3-82ba51c06648 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.457759] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0cdb23b-96f1-40ab-a283-ba53ebceda36 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.967148] env[67964]: DEBUG nova.network.neutron [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Successfully created port: 536e4923-beaf-4731-a683-58223cb45e85 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 783.070809] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquiring lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 783.071071] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 783.741437] env[67964]: DEBUG nova.network.neutron [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Successfully updated port: 536e4923-beaf-4731-a683-58223cb45e85 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 783.757140] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Acquiring 
lock "refresh_cache-0768fe80-7dd3-42ec-8e22-42a6aece5bef" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 783.757140] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Acquired lock "refresh_cache-0768fe80-7dd3-42ec-8e22-42a6aece5bef" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 783.757140] env[67964]: DEBUG nova.network.neutron [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 783.824203] env[67964]: DEBUG nova.network.neutron [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 784.068472] env[67964]: DEBUG nova.network.neutron [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Updating instance_info_cache with network_info: [{"id": "536e4923-beaf-4731-a683-58223cb45e85", "address": "fa:16:3e:37:b2:02", "network": {"id": "6329b057-bc56-43f3-aa35-8426121d0220", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1982945155-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6ddfb5b3af37495f80bb33263d56940b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0d7a2b2f-3b49-4dc8-9096-af16144b27a9", "external-id": "nsx-vlan-transportzone-492", "segmentation_id": 492, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap536e4923-be", "ovs_interfaceid": "536e4923-beaf-4731-a683-58223cb45e85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 784.080340] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Releasing lock "refresh_cache-0768fe80-7dd3-42ec-8e22-42a6aece5bef" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 784.080667] env[67964]: DEBUG nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Instance network_info: 
|[{"id": "536e4923-beaf-4731-a683-58223cb45e85", "address": "fa:16:3e:37:b2:02", "network": {"id": "6329b057-bc56-43f3-aa35-8426121d0220", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1982945155-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6ddfb5b3af37495f80bb33263d56940b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0d7a2b2f-3b49-4dc8-9096-af16144b27a9", "external-id": "nsx-vlan-transportzone-492", "segmentation_id": 492, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap536e4923-be", "ovs_interfaceid": "536e4923-beaf-4731-a683-58223cb45e85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 784.081084] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:37:b2:02', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0d7a2b2f-3b49-4dc8-9096-af16144b27a9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '536e4923-beaf-4731-a683-58223cb45e85', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 784.088534] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Creating folder: Project (6ddfb5b3af37495f80bb33263d56940b). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 784.089115] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cc3f5994-4cc8-4884-95e5-6bd0e24bbdc1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 784.100749] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Created folder: Project (6ddfb5b3af37495f80bb33263d56940b) in parent group-v690366. [ 784.100975] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Creating folder: Instances. Parent ref: group-v690410. 
{{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 784.101253] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7331abea-89ff-482d-a5f1-90d2b9ff6cfd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 784.110440] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Created folder: Instances in parent group-v690410. [ 784.110778] env[67964]: DEBUG oslo.service.loopingcall [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 784.110905] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 784.111111] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-59180909-4a99-46ee-985f-470fb71580cb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 784.128490] env[67964]: DEBUG nova.compute.manager [req-ff48cc6a-e821-4929-850e-4d744e7f45d7 req-ebd53b58-315d-4c36-97c7-72933adc8491 service nova] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Received event network-vif-plugged-536e4923-beaf-4731-a683-58223cb45e85 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 784.128490] env[67964]: DEBUG oslo_concurrency.lockutils [req-ff48cc6a-e821-4929-850e-4d744e7f45d7 req-ebd53b58-315d-4c36-97c7-72933adc8491 service nova] Acquiring lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 784.128490] env[67964]: DEBUG oslo_concurrency.lockutils [req-ff48cc6a-e821-4929-850e-4d744e7f45d7 req-ebd53b58-315d-4c36-97c7-72933adc8491 service nova] Lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 784.128490] env[67964]: DEBUG oslo_concurrency.lockutils [req-ff48cc6a-e821-4929-850e-4d744e7f45d7 req-ebd53b58-315d-4c36-97c7-72933adc8491 service nova] Lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 784.128637] env[67964]: DEBUG nova.compute.manager [req-ff48cc6a-e821-4929-850e-4d744e7f45d7 req-ebd53b58-315d-4c36-97c7-72933adc8491 service nova] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] No waiting events found dispatching network-vif-plugged-536e4923-beaf-4731-a683-58223cb45e85 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 784.128637] env[67964]: WARNING nova.compute.manager [req-ff48cc6a-e821-4929-850e-4d744e7f45d7 req-ebd53b58-315d-4c36-97c7-72933adc8491 service nova] 
[instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Received unexpected event network-vif-plugged-536e4923-beaf-4731-a683-58223cb45e85 for instance with vm_state building and task_state spawning. [ 784.128637] env[67964]: DEBUG nova.compute.manager [req-ff48cc6a-e821-4929-850e-4d744e7f45d7 req-ebd53b58-315d-4c36-97c7-72933adc8491 service nova] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Received event network-changed-536e4923-beaf-4731-a683-58223cb45e85 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 784.128637] env[67964]: DEBUG nova.compute.manager [req-ff48cc6a-e821-4929-850e-4d744e7f45d7 req-ebd53b58-315d-4c36-97c7-72933adc8491 service nova] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Refreshing instance network info cache due to event network-changed-536e4923-beaf-4731-a683-58223cb45e85. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 784.128761] env[67964]: DEBUG oslo_concurrency.lockutils [req-ff48cc6a-e821-4929-850e-4d744e7f45d7 req-ebd53b58-315d-4c36-97c7-72933adc8491 service nova] Acquiring lock "refresh_cache-0768fe80-7dd3-42ec-8e22-42a6aece5bef" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 784.128893] env[67964]: DEBUG oslo_concurrency.lockutils [req-ff48cc6a-e821-4929-850e-4d744e7f45d7 req-ebd53b58-315d-4c36-97c7-72933adc8491 service nova] Acquired lock "refresh_cache-0768fe80-7dd3-42ec-8e22-42a6aece5bef" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 784.129110] env[67964]: DEBUG nova.network.neutron [req-ff48cc6a-e821-4929-850e-4d744e7f45d7 req-ebd53b58-315d-4c36-97c7-72933adc8491 service nova] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Refreshing network info cache for port 536e4923-beaf-4731-a683-58223cb45e85 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 784.134227] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 784.134227] env[67964]: value = "task-3456739" [ 784.134227] env[67964]: _type = "Task" [ 784.134227] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 784.141715] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456739, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 784.450752] env[67964]: DEBUG nova.network.neutron [req-ff48cc6a-e821-4929-850e-4d744e7f45d7 req-ebd53b58-315d-4c36-97c7-72933adc8491 service nova] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Updated VIF entry in instance network info cache for port 536e4923-beaf-4731-a683-58223cb45e85. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 784.451129] env[67964]: DEBUG nova.network.neutron [req-ff48cc6a-e821-4929-850e-4d744e7f45d7 req-ebd53b58-315d-4c36-97c7-72933adc8491 service nova] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Updating instance_info_cache with network_info: [{"id": "536e4923-beaf-4731-a683-58223cb45e85", "address": "fa:16:3e:37:b2:02", "network": {"id": "6329b057-bc56-43f3-aa35-8426121d0220", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1982945155-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6ddfb5b3af37495f80bb33263d56940b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0d7a2b2f-3b49-4dc8-9096-af16144b27a9", "external-id": "nsx-vlan-transportzone-492", "segmentation_id": 492, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap536e4923-be", "ovs_interfaceid": "536e4923-beaf-4731-a683-58223cb45e85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 784.460956] env[67964]: DEBUG oslo_concurrency.lockutils [req-ff48cc6a-e821-4929-850e-4d744e7f45d7 req-ebd53b58-315d-4c36-97c7-72933adc8491 service nova] Releasing lock "refresh_cache-0768fe80-7dd3-42ec-8e22-42a6aece5bef" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 784.643931] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456739, 'name': CreateVM_Task, 'duration_secs': 0.292142} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 784.644116] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 784.644784] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 784.644946] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 784.645269] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 784.645507] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d566a7eb-0ed6-4384-8a31-ca3dd65ca3e1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 784.650027] env[67964]: DEBUG oslo_vmware.api [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Waiting for the task: (returnval){
[ 784.650027] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]523623e6-1bd9-0650-17db-b15223824c12"
[ 784.650027] env[67964]: _type = "Task"
[ 784.650027] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 784.657727] env[67964]: DEBUG oslo_vmware.api [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]523623e6-1bd9-0650-17db-b15223824c12, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 785.163158] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 785.163158] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 785.163158] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 785.868798] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f432d495-e654-4b79-b945-687d91e57cef tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Acquiring lock "8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 788.965590] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71b5b1d6-ae00-4b1d-af29-e786806cdb3b tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Acquiring lock "180338df-2738-4eeb-8610-cb130d04f6d2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 789.816307] env[67964]: DEBUG oslo_concurrency.lockutils [None req-20959877-1cd3-47cf-a18c-1d57e921fdfa tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Acquiring lock "8b261c6e-741c-4d6c-9567-566af85cd68f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 791.605090] env[67964]: DEBUG oslo_concurrency.lockutils [None req-741723e0-d37f-48f1-8bd7-8462d0b46433 tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Acquiring lock "9c586d33-c563-45c7-8c54-1638a78a669c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 796.685970] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquiring lock "6580c348-f5a4-4f20-a6fb-8942202a526e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 797.605156] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bf83786f-1361-4017-9def-68771e1eb59c tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Acquiring lock "fed6991c-9b59-43bb-8cda-96053adb798b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 799.178034] env[67964]: DEBUG oslo_concurrency.lockutils [None req-245249e1-0137-4a64-b3d5-90551b6e0434 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Acquiring lock "707828f6-0267-42ff-95e5-6b328382b017" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 811.343540] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "9793d383-9033-4f86-b7bb-6b2e43347cd6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 811.343830] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "9793d383-9033-4f86-b7bb-6b2e43347cd6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 812.800524] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c3d88bb4-3d9b-442a-97a3-fb5187204407 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Acquiring lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 826.459453] env[67964]: WARNING oslo_vmware.rw_handles [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 826.459453] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 826.459453] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 826.459453] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 826.459453] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 826.459453] env[67964]: ERROR oslo_vmware.rw_handles response.begin()
[ 826.459453] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 826.459453] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 826.459453] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 826.459453] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 826.459453] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 826.459453] env[67964]: ERROR oslo_vmware.rw_handles
[ 826.459983] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/b66c1352-2924-4bb8-a0c9-27bc6a4e7227/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 826.464891] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 826.464891] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Copying Virtual Disk [datastore1] vmware_temp/b66c1352-2924-4bb8-a0c9-27bc6a4e7227/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/b66c1352-2924-4bb8-a0c9-27bc6a4e7227/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 826.464891] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8a6773e0-5ab6-4b61-8973-bdd317ad672f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 826.471817] env[67964]: DEBUG oslo_vmware.api [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Waiting for the task: (returnval){
[ 826.471817] env[67964]: value = "task-3456740"
[ 826.471817] env[67964]: _type = "Task"
[ 826.471817] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 826.483762] env[67964]: DEBUG oslo_vmware.api [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Task: {'id': task-3456740, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 826.986825] env[67964]: DEBUG oslo_vmware.exceptions [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 826.987441] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 826.987809] env[67964]: ERROR nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 826.987809] env[67964]: Faults: ['InvalidArgument']
[ 826.987809] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Traceback (most recent call last):
[ 826.987809] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources
[ 826.987809] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] yield resources
[ 826.987809] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 826.987809] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] self.driver.spawn(context, instance, image_meta,
[ 826.987809] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 826.987809] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 826.987809] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 826.987809] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] self._fetch_image_if_missing(context, vi)
[ 826.987809] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 826.988360] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] image_cache(vi, tmp_image_ds_loc)
[ 826.988360] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 826.988360] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] vm_util.copy_virtual_disk(
[ 826.988360] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 826.988360] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] session._wait_for_task(vmdk_copy_task)
[ 826.988360] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 826.988360] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] return self.wait_for_task(task_ref)
[ 826.988360] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 826.988360] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] return evt.wait()
[ 826.988360] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 826.988360] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] result = hub.switch()
[ 826.988360] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 826.988360] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] return self.greenlet.switch()
[ 826.988807] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 826.988807] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] self.f(*self.args, **self.kw)
[ 826.988807] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 826.988807] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] raise exceptions.translate_fault(task_info.error)
[ 826.988807] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 826.988807] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Faults: ['InvalidArgument']
[ 826.988807] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b]
[ 826.988807] env[67964]: INFO nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Terminating instance
[ 826.991486] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 826.991486] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 826.992182] env[67964]: DEBUG nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 826.992256] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 826.994073] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c9ad7113-35a8-4d26-bc0a-3ca8643f002c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 826.999508] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04b1dba3-a2bc-4522-8f72-10c134d58b20 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 827.007433] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 827.008610] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4a0fd5d7-323e-47fe-afa4-b739d81f1c1a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 827.013762] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 827.013762] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 827.013762] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-80c13064-9944-468c-a789-e707663b54e2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 827.023190] env[67964]: DEBUG oslo_vmware.api [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Waiting for the task: (returnval){
[ 827.023190] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52db0401-d440-e379-8151-c187506c0c3d"
[ 827.023190] env[67964]: _type = "Task"
[ 827.023190] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 827.034376] env[67964]: DEBUG oslo_vmware.api [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52db0401-d440-e379-8151-c187506c0c3d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 827.338057] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3deb3d74-23c8-49db-845f-18a0428a7b24 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "6c329e27-945e-4996-9994-85d207c35325" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 827.339029] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3deb3d74-23c8-49db-845f-18a0428a7b24 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "6c329e27-945e-4996-9994-85d207c35325" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 827.536417] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 827.537541] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Creating directory with path [datastore1] vmware_temp/358a8fe6-a398-4736-b394-abf9e698885e/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 827.537660] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cf3a6dc3-1fde-4283-a03f-8eab91f77ce6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 827.563356] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Created directory with path [datastore1] vmware_temp/358a8fe6-a398-4736-b394-abf9e698885e/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 827.563356] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Fetch image to [datastore1] vmware_temp/358a8fe6-a398-4736-b394-abf9e698885e/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 827.563524] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/358a8fe6-a398-4736-b394-abf9e698885e/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 827.564384] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-874a2ed0-b377-4995-972d-72b2acf7acf7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 827.572373] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4c1d974-c281-4eb5-a08d-0aff807827bc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 827.583825] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c7fa4d8-07ff-4eec-8414-ccde8131baa8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 827.617774] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cf958b2-15f6-4cdc-b0cc-8176aa5f8e89 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 827.624051] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-51a7f16d-908c-4399-b42b-5ea80dfbec7c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 827.661620] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 827.731877] env[67964]: DEBUG oslo_vmware.rw_handles [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/358a8fe6-a398-4736-b394-abf9e698885e/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 827.807560] env[67964]: DEBUG oslo_vmware.rw_handles [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 827.807750] env[67964]: DEBUG oslo_vmware.rw_handles [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/358a8fe6-a398-4736-b394-abf9e698885e/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 828.054626] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 828.054851] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 828.055045] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Deleting the datastore file [datastore1] 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 828.055442] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3d58f19f-8c3d-43dc-9801-210789bbdcbc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 828.063083] env[67964]: DEBUG oslo_vmware.api [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Waiting for the task: (returnval){
[ 828.063083] env[67964]: value = "task-3456742"
[ 828.063083] env[67964]: _type = "Task"
[ 828.063083] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 828.073080] env[67964]: DEBUG oslo_vmware.api [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Task: {'id': task-3456742, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 828.574496] env[67964]: DEBUG oslo_vmware.api [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Task: {'id': task-3456742, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076309} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 828.576141] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 828.576141] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 828.576141] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 828.576141] env[67964]: INFO nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Took 1.58 seconds to destroy the instance on the hypervisor.
[ 828.578370] env[67964]: DEBUG nova.compute.claims [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 828.578727] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 828.584081] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 829.049452] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8a123f8-cc5d-4fe9-a6eb-8404192d122d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 829.060869] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99deb2c2-8b24-4c96-bf27-531a1bdae47d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 829.096789] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-687aa308-d3ea-49d1-825a-7d76461efd12 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 829.104771] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4be96c93-d493-43c9-8cf2-daeeb7bdd4a1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 829.118924] env[67964]: DEBUG nova.compute.provider_tree [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 829.129927] env[67964]: DEBUG nova.scheduler.client.report [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}}
[ 829.148299] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.568s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 829.148944] env[67964]: ERROR nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 829.148944] env[67964]: Faults: ['InvalidArgument']
[ 829.148944] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Traceback (most recent call last):
[ 829.148944] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 829.148944] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] self.driver.spawn(context, instance, image_meta,
[ 829.148944] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 829.148944] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 829.148944] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 829.148944] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] self._fetch_image_if_missing(context, vi)
[ 829.148944] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 829.148944] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] image_cache(vi, tmp_image_ds_loc)
[ 829.148944] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 829.149520] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] vm_util.copy_virtual_disk(
[ 829.149520] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 829.149520] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] session._wait_for_task(vmdk_copy_task)
[ 829.149520] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 829.149520] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] return self.wait_for_task(task_ref)
[ 829.149520] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 829.149520] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] return evt.wait()
[ 829.149520] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 829.149520] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] result = hub.switch()
[ 829.149520] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 829.149520] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] return self.greenlet.switch()
[ 829.149520] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 829.149520] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] self.f(*self.args, **self.kw)
[ 829.149935] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 829.149935] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] raise exceptions.translate_fault(task_info.error)
[ 829.149935] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 829.149935] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Faults: ['InvalidArgument']
[ 829.149935] env[67964]: ERROR nova.compute.manager [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b]
[ 829.149935] env[67964]: DEBUG nova.compute.utils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 829.151460] env[67964]: DEBUG nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Build of instance 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b was re-scheduled: A specified parameter was not correct: fileType
[ 829.151460] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 829.152514] env[67964]: DEBUG nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 829.152514] env[67964]: DEBUG nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 829.152514] env[67964]: DEBUG nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 829.152514] env[67964]: DEBUG nova.network.neutron [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 830.068180] env[67964]: DEBUG nova.network.neutron [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 830.082044] env[67964]: INFO nova.compute.manager [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Took 0.93 seconds to deallocate network for instance.
[ 830.211503] env[67964]: INFO nova.scheduler.client.report [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Deleted allocations for instance 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b
[ 830.242334] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c84cc7ce-88a5-4940-904b-0f60d396497b tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Lock "8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 245.027s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 830.243501] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f432d495-e654-4b79-b945-687d91e57cef tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Lock "8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 44.375s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 830.243725] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f432d495-e654-4b79-b945-687d91e57cef tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Acquiring lock "8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 830.243933] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f432d495-e654-4b79-b945-687d91e57cef tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Lock "8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 830.244103] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f432d495-e654-4b79-b945-687d91e57cef tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Lock "8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 830.251618] env[67964]: INFO nova.compute.manager [None req-f432d495-e654-4b79-b945-687d91e57cef tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Terminating instance
[ 830.252756] env[67964]: DEBUG nova.compute.manager [None req-f432d495-e654-4b79-b945-687d91e57cef tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 830.252951] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f432d495-e654-4b79-b945-687d91e57cef tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 830.253479] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5200d024-c698-4d16-bc7b-2ae655b02948 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.266392] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc7eab80-079f-49ae-8078-15b14d730746 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.281706] env[67964]: DEBUG nova.compute.manager [None req-e75c5a43-89ad-429d-916e-af1268f0c030 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0ad6ab85-b1d6-479d-85a4-ff8ce5fb26e4] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 830.305816] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-f432d495-e654-4b79-b945-687d91e57cef tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b could not be found.
[ 830.306209] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f432d495-e654-4b79-b945-687d91e57cef tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 830.306209] env[67964]: INFO nova.compute.manager [None req-f432d495-e654-4b79-b945-687d91e57cef tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 830.306759] env[67964]: DEBUG oslo.service.loopingcall [None req-f432d495-e654-4b79-b945-687d91e57cef tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 830.306759] env[67964]: DEBUG nova.compute.manager [-] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 830.306873] env[67964]: DEBUG nova.network.neutron [-] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 830.319399] env[67964]: DEBUG nova.compute.manager [None req-e75c5a43-89ad-429d-916e-af1268f0c030 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0ad6ab85-b1d6-479d-85a4-ff8ce5fb26e4] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 830.345747] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e75c5a43-89ad-429d-916e-af1268f0c030 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Lock "0ad6ab85-b1d6-479d-85a4-ff8ce5fb26e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 219.514s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 830.363104] env[67964]: DEBUG nova.compute.manager [None req-5e85a1ef-d577-48c0-b398-bc3ae3f17bd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 38541c24-fc6b-4385-91bf-de25df66a798] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 830.383678] env[67964]: DEBUG nova.network.neutron [-] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 830.403133] env[67964]: DEBUG nova.compute.manager [None req-5e85a1ef-d577-48c0-b398-bc3ae3f17bd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 38541c24-fc6b-4385-91bf-de25df66a798] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 830.403133] env[67964]: INFO nova.compute.manager [-] [instance: 8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b] Took 0.10 seconds to deallocate network for instance.
[ 830.423455] env[67964]: DEBUG oslo_concurrency.lockutils [None req-5e85a1ef-d577-48c0-b398-bc3ae3f17bd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "38541c24-fc6b-4385-91bf-de25df66a798" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.833s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 830.437447] env[67964]: DEBUG nova.compute.manager [None req-dc4fe31c-7623-462b-8302-75835785f8ac tempest-VolumesAssistedSnapshotsTest-1589554248 tempest-VolumesAssistedSnapshotsTest-1589554248-project-member] [instance: 068b288c-194b-4d2c-89c3-8adb7d628cc7] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 830.489157] env[67964]: DEBUG nova.compute.manager [None req-dc4fe31c-7623-462b-8302-75835785f8ac tempest-VolumesAssistedSnapshotsTest-1589554248 tempest-VolumesAssistedSnapshotsTest-1589554248-project-member] [instance: 068b288c-194b-4d2c-89c3-8adb7d628cc7] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 830.513108] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dc4fe31c-7623-462b-8302-75835785f8ac tempest-VolumesAssistedSnapshotsTest-1589554248 tempest-VolumesAssistedSnapshotsTest-1589554248-project-member] Lock "068b288c-194b-4d2c-89c3-8adb7d628cc7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.326s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 830.526784] env[67964]: DEBUG nova.compute.manager [None req-7d77b253-0fe5-4cc4-9479-7f7d1f381d3d tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 834fdc0b-5b2f-4374-a77a-de970c10e125] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 830.583908] env[67964]: DEBUG nova.compute.manager [None req-7d77b253-0fe5-4cc4-9479-7f7d1f381d3d tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 834fdc0b-5b2f-4374-a77a-de970c10e125] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 830.605303] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f432d495-e654-4b79-b945-687d91e57cef tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Lock "8c1f8c1e-b86a-40bc-b3af-335c1ad3d77b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.362s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 830.622896] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7d77b253-0fe5-4cc4-9479-7f7d1f381d3d tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Lock "834fdc0b-5b2f-4374-a77a-de970c10e125" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 217.040s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 830.633643] env[67964]: DEBUG nova.compute.manager [None req-54a71b18-cf24-437d-82bb-b8086dd8588a tempest-ServersAdmin275Test-280874369 tempest-ServersAdmin275Test-280874369-project-member] [instance: f50fc747-d3b4-456c-b86f-a086a7968329] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 830.663208] env[67964]: DEBUG nova.compute.manager [None req-54a71b18-cf24-437d-82bb-b8086dd8588a tempest-ServersAdmin275Test-280874369 tempest-ServersAdmin275Test-280874369-project-member] [instance: f50fc747-d3b4-456c-b86f-a086a7968329] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 830.692543] env[67964]: DEBUG oslo_concurrency.lockutils [None req-54a71b18-cf24-437d-82bb-b8086dd8588a tempest-ServersAdmin275Test-280874369 tempest-ServersAdmin275Test-280874369-project-member] Lock "f50fc747-d3b4-456c-b86f-a086a7968329" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.657s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 830.708072] env[67964]: DEBUG nova.compute.manager [None req-e5250760-6ed9-47eb-8175-6ce343c66ac9 tempest-ListImageFiltersTestJSON-223070775 tempest-ListImageFiltersTestJSON-223070775-project-member] [instance: f9e00f5a-036a-4141-b4ae-bda4c8a4c11b] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 830.777849] env[67964]: DEBUG nova.compute.manager [None req-e5250760-6ed9-47eb-8175-6ce343c66ac9 tempest-ListImageFiltersTestJSON-223070775 tempest-ListImageFiltersTestJSON-223070775-project-member] [instance: f9e00f5a-036a-4141-b4ae-bda4c8a4c11b] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 830.811229] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e5250760-6ed9-47eb-8175-6ce343c66ac9 tempest-ListImageFiltersTestJSON-223070775 tempest-ListImageFiltersTestJSON-223070775-project-member] Lock "f9e00f5a-036a-4141-b4ae-bda4c8a4c11b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.455s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 830.827540] env[67964]: DEBUG nova.compute.manager [None req-2a789350-41a4-4acc-8b9a-6b83ffe27ae6 tempest-ImagesNegativeTestJSON-124339597 tempest-ImagesNegativeTestJSON-124339597-project-member] [instance: 188890b5-2189-4499-9856-22dc65b6c6f1] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 830.865703] env[67964]: DEBUG nova.compute.manager [None req-2a789350-41a4-4acc-8b9a-6b83ffe27ae6 tempest-ImagesNegativeTestJSON-124339597 tempest-ImagesNegativeTestJSON-124339597-project-member] [instance: 188890b5-2189-4499-9856-22dc65b6c6f1] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 830.894896] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2a789350-41a4-4acc-8b9a-6b83ffe27ae6 tempest-ImagesNegativeTestJSON-124339597 tempest-ImagesNegativeTestJSON-124339597-project-member] Lock "188890b5-2189-4499-9856-22dc65b6c6f1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.768s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 830.909198] env[67964]: DEBUG nova.compute.manager [None req-c3274f14-016e-4655-bde3-8f9ae729ec9e tempest-ServersWithSpecificFlavorTestJSON-1528302942 tempest-ServersWithSpecificFlavorTestJSON-1528302942-project-member] [instance: d84f0e97-24d3-4b0b-8eff-51cf6bfd980c] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 830.941891] env[67964]: DEBUG nova.compute.manager [None req-c3274f14-016e-4655-bde3-8f9ae729ec9e tempest-ServersWithSpecificFlavorTestJSON-1528302942 tempest-ServersWithSpecificFlavorTestJSON-1528302942-project-member] [instance: d84f0e97-24d3-4b0b-8eff-51cf6bfd980c] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 830.978444] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c3274f14-016e-4655-bde3-8f9ae729ec9e tempest-ServersWithSpecificFlavorTestJSON-1528302942 tempest-ServersWithSpecificFlavorTestJSON-1528302942-project-member] Lock "d84f0e97-24d3-4b0b-8eff-51cf6bfd980c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.898s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 830.991527] env[67964]: DEBUG nova.compute.manager [None req-51b9fbd5-60e6-4b63-9b0e-7070e6505d41 tempest-ListImageFiltersTestJSON-223070775 tempest-ListImageFiltersTestJSON-223070775-project-member] [instance: f963e0c6-8a3d-4872-8cae-07fff845b77f] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 831.026893] env[67964]: DEBUG nova.compute.manager [None req-51b9fbd5-60e6-4b63-9b0e-7070e6505d41 tempest-ListImageFiltersTestJSON-223070775 tempest-ListImageFiltersTestJSON-223070775-project-member] [instance: f963e0c6-8a3d-4872-8cae-07fff845b77f] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 831.052669] env[67964]: DEBUG oslo_concurrency.lockutils [None req-51b9fbd5-60e6-4b63-9b0e-7070e6505d41 tempest-ListImageFiltersTestJSON-223070775 tempest-ListImageFiltersTestJSON-223070775-project-member] Lock "f963e0c6-8a3d-4872-8cae-07fff845b77f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 211.564s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 831.065314] env[67964]: DEBUG nova.compute.manager [None req-6a025ec9-bcde-4f7b-b88f-736df426b959 tempest-ServersTestBootFromVolume-544690349 tempest-ServersTestBootFromVolume-544690349-project-member] [instance: 8028a7dd-4002-4db3-a738-3926c1d2340e] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 831.096793] env[67964]: DEBUG nova.compute.manager [None req-6a025ec9-bcde-4f7b-b88f-736df426b959 tempest-ServersTestBootFromVolume-544690349 tempest-ServersTestBootFromVolume-544690349-project-member] [instance: 8028a7dd-4002-4db3-a738-3926c1d2340e] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 831.129021] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6a025ec9-bcde-4f7b-b88f-736df426b959 tempest-ServersTestBootFromVolume-544690349 tempest-ServersTestBootFromVolume-544690349-project-member] Lock "8028a7dd-4002-4db3-a738-3926c1d2340e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.540s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 831.135860] env[67964]: DEBUG nova.compute.manager [None req-9770d661-95f0-420f-aec2-ae8674b60e20 tempest-FloatingIPsAssociationTestJSON-985246946 tempest-FloatingIPsAssociationTestJSON-985246946-project-member] [instance: 743b56db-49a0-4af7-96bc-a3fc6025fa19] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 831.170551] env[67964]: DEBUG nova.compute.manager [None req-9770d661-95f0-420f-aec2-ae8674b60e20 tempest-FloatingIPsAssociationTestJSON-985246946 tempest-FloatingIPsAssociationTestJSON-985246946-project-member] [instance: 743b56db-49a0-4af7-96bc-a3fc6025fa19] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 831.196300] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9770d661-95f0-420f-aec2-ae8674b60e20 tempest-FloatingIPsAssociationTestJSON-985246946 tempest-FloatingIPsAssociationTestJSON-985246946-project-member] Lock "743b56db-49a0-4af7-96bc-a3fc6025fa19" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 209.557s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 831.208987] env[67964]: DEBUG nova.compute.manager [None req-cd0282c1-c34c-42b8-9f6a-3c6745b2c172 tempest-SecurityGroupsTestJSON-63199574 tempest-SecurityGroupsTestJSON-63199574-project-member] [instance: 45e517a6-1ef1-4082-b5f9-24a9c932630c] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 831.237041] env[67964]: DEBUG nova.compute.manager [None req-cd0282c1-c34c-42b8-9f6a-3c6745b2c172 tempest-SecurityGroupsTestJSON-63199574 tempest-SecurityGroupsTestJSON-63199574-project-member] [instance: 45e517a6-1ef1-4082-b5f9-24a9c932630c] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 831.262599] env[67964]: DEBUG oslo_concurrency.lockutils [None req-cd0282c1-c34c-42b8-9f6a-3c6745b2c172 tempest-SecurityGroupsTestJSON-63199574 tempest-SecurityGroupsTestJSON-63199574-project-member] Lock "45e517a6-1ef1-4082-b5f9-24a9c932630c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 209.618s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 831.278978] env[67964]: DEBUG nova.compute.manager [None req-9b478b52-92e0-4f1b-b4f8-de3b304cfe35 tempest-ServerGroupTestJSON-612408333 tempest-ServerGroupTestJSON-612408333-project-member] [instance: 5b7a605b-7521-40c3-92d1-ce5487f6fedd] Starting instance...
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 831.312495] env[67964]: DEBUG nova.compute.manager [None req-9b478b52-92e0-4f1b-b4f8-de3b304cfe35 tempest-ServerGroupTestJSON-612408333 tempest-ServerGroupTestJSON-612408333-project-member] [instance: 5b7a605b-7521-40c3-92d1-ce5487f6fedd] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 831.343357] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9b478b52-92e0-4f1b-b4f8-de3b304cfe35 tempest-ServerGroupTestJSON-612408333 tempest-ServerGroupTestJSON-612408333-project-member] Lock "5b7a605b-7521-40c3-92d1-ce5487f6fedd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.733s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 831.359437] env[67964]: DEBUG nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 831.427192] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 831.427463] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 831.431258] env[67964]: INFO nova.compute.claims [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 831.893896] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c477fdb-1932-4e7b-af33-4545cf8ea206 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 831.903188] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-786bc34e-b23f-4019-b4f5-4d03bafc2eba {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 831.940746] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eb4700a-f4a8-4fe5-b55c-eca7f0fffdce {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 831.949648] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0eb1f47a-4269-4c1c-9182-e312ff23a990 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
831.963437] env[67964]: DEBUG nova.compute.provider_tree [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 831.977377] env[67964]: DEBUG nova.scheduler.client.report [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 831.994985] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.567s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 831.995149] env[67964]: DEBUG nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 832.038215] env[67964]: DEBUG nova.compute.utils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 832.043016] env[67964]: DEBUG nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 832.043016] env[67964]: DEBUG nova.network.neutron [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 832.059027] env[67964]: DEBUG nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Start building block device mappings for instance. 
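The inventory entry above fixes the provider's schedulable capacity. As a cross-check, here is a minimal arithmetic sketch in plain Python (not a Nova or placement API; it assumes placement's usual capacity rule of (total - reserved) * allocation_ratio, with max_unit capping any single request):

    # Recompute schedulable capacity from the inventory dict logged above.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0, 'max_unit': 95},
    }
    for rc, inv in inventory.items():
        capacity = int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])
        print(f"{rc}: capacity={capacity}, max_unit={inv['max_unit']}")
    # VCPU: 192 (48 host vcpus at 4.0 overcommit), MEMORY_MB: 196078, DISK_GB: 400;
    # no single allocation may exceed max_unit (16 VCPU, 65530 MB, 95 GB).
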
{{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 832.134876] env[67964]: DEBUG nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Start spawning the instance on the hypervisor. {{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 832.161257] env[67964]: DEBUG nova.virt.hardware [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 832.161444] env[67964]: DEBUG nova.virt.hardware [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 832.161601] env[67964]: DEBUG nova.virt.hardware [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 832.161780] env[67964]: DEBUG nova.virt.hardware [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 832.161922] env[67964]: DEBUG nova.virt.hardware [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 832.162118] env[67964]: DEBUG nova.virt.hardware [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 832.162341] env[67964]: DEBUG nova.virt.hardware [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:569}} [ 832.162497] env[67964]: DEBUG nova.virt.hardware [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 832.162663] env[67964]: DEBUG nova.virt.hardware [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 832.162963] env[67964]: DEBUG nova.virt.hardware [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 832.163044] env[67964]: DEBUG nova.virt.hardware [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 832.163904] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb087530-aec6-4dd1-83bd-33d06e1c1266 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 832.172732] env[67964]: DEBUG nova.policy [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '43fd69d272154d44a5ee8f168321b3f1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f1b85d829ed64fec8f242ab1f4666cdf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 832.175228] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-faf2d103-a466-4d3b-aa11-b308f9c10e47 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 832.971242] env[67964]: DEBUG nova.network.neutron [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Successfully created port: c200fbe5-6838-4ca1-b819-327a8bbf5719 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 833.522821] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Acquiring lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 833.523372] 
env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 833.799967] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 833.801870] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Cleaning up deleted instances {{(pid=67964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11199}} [ 833.816403] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] There are 0 instances to clean {{(pid=67964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11208}} [ 833.816720] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 833.816841] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Cleaning up deleted instances with incomplete migration {{(pid=67964) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11237}} [ 833.831882] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 834.677106] env[67964]: DEBUG nova.network.neutron [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Successfully updated port: c200fbe5-6838-4ca1-b819-327a8bbf5719 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 834.698539] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Acquiring lock "refresh_cache-9e47d3ce-3897-458b-ac85-d98745e9aeb5" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 834.698677] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Acquired lock "refresh_cache-9e47d3ce-3897-458b-ac85-d98745e9aeb5" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 834.698823] env[67964]: DEBUG nova.network.neutron [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Building network info cache for instance {{(pid=67964) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2010}} [ 834.755990] env[67964]: DEBUG nova.network.neutron [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 834.842025] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 835.058104] env[67964]: DEBUG nova.network.neutron [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Updating instance_info_cache with network_info: [{"id": "c200fbe5-6838-4ca1-b819-327a8bbf5719", "address": "fa:16:3e:72:bf:94", "network": {"id": "b1620391-3676-4162-8f61-cd282b73fd16", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1689141491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f1b85d829ed64fec8f242ab1f4666cdf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "edd47158-6f4b-44a1-8e82-0411205ad299", "external-id": "nsx-vlan-transportzone-587", "segmentation_id": 587, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc200fbe5-68", "ovs_interfaceid": "c200fbe5-6838-4ca1-b819-327a8bbf5719", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 835.077668] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8a1228d9-e9cd-4b75-a74e-e426aacd4e19 tempest-ServersNegativeTestJSON-1937387395 tempest-ServersNegativeTestJSON-1937387395-project-member] Acquiring lock "09c05646-301a-4d74-957c-1c9c6b7ab44b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 835.077908] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8a1228d9-e9cd-4b75-a74e-e426aacd4e19 tempest-ServersNegativeTestJSON-1937387395 tempest-ServersNegativeTestJSON-1937387395-project-member] Lock "09c05646-301a-4d74-957c-1c9c6b7ab44b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 835.089508] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Releasing lock "refresh_cache-9e47d3ce-3897-458b-ac85-d98745e9aeb5" {{(pid=67964) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 835.091242] env[67964]: DEBUG nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Instance network_info: |[{"id": "c200fbe5-6838-4ca1-b819-327a8bbf5719", "address": "fa:16:3e:72:bf:94", "network": {"id": "b1620391-3676-4162-8f61-cd282b73fd16", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1689141491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f1b85d829ed64fec8f242ab1f4666cdf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "edd47158-6f4b-44a1-8e82-0411205ad299", "external-id": "nsx-vlan-transportzone-587", "segmentation_id": 587, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc200fbe5-68", "ovs_interfaceid": "c200fbe5-6838-4ca1-b819-327a8bbf5719", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 835.091344] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:72:bf:94', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'edd47158-6f4b-44a1-8e82-0411205ad299', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c200fbe5-6838-4ca1-b819-327a8bbf5719', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 835.097846] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Creating folder: Project (f1b85d829ed64fec8f242ab1f4666cdf). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 835.099539] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-94e2bfc7-c74c-4a45-aeb4-c2702d185501 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 835.110660] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Created folder: Project (f1b85d829ed64fec8f242ab1f4666cdf) in parent group-v690366. [ 835.110884] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Creating folder: Instances. Parent ref: group-v690413. 
{{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 835.111140] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e440e8e4-58d4-49bc-8fe1-522d0909f2c1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 835.122529] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Created folder: Instances in parent group-v690413. [ 835.123101] env[67964]: DEBUG oslo.service.loopingcall [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 835.123101] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 835.123396] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ae088c47-80dd-446c-bd03-3204324b3395 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 835.148834] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 835.148834] env[67964]: value = "task-3456745" [ 835.148834] env[67964]: _type = "Task" [ 835.148834] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 835.157356] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456745, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 835.255514] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] Acquiring lock "f6aedef6-3d4d-4839-863b-771ac818a1c4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 835.255799] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] Lock "f6aedef6-3d4d-4839-863b-771ac818a1c4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 835.292064] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] Acquiring lock "34481f0e-b35a-4405-be54-ac23326f1183" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 835.292754] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] Lock "34481f0e-b35a-4405-be54-ac23326f1183" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 835.325527] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] Acquiring lock "ae68e8fe-d3d6-4313-85d7-7e2fefa3a1ca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 835.325527] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] Lock "ae68e8fe-d3d6-4313-85d7-7e2fefa3a1ca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 835.341170] env[67964]: DEBUG nova.compute.manager [req-6b69f5ec-e8f6-49c2-9c97-49d20f63628b req-4a4437dc-1e64-4a54-9991-b4c5edd750d4 service nova] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Received event network-vif-plugged-c200fbe5-6838-4ca1-b819-327a8bbf5719 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 835.341399] env[67964]: DEBUG oslo_concurrency.lockutils [req-6b69f5ec-e8f6-49c2-9c97-49d20f63628b req-4a4437dc-1e64-4a54-9991-b4c5edd750d4 service nova] Acquiring lock 
"9e47d3ce-3897-458b-ac85-d98745e9aeb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 835.341604] env[67964]: DEBUG oslo_concurrency.lockutils [req-6b69f5ec-e8f6-49c2-9c97-49d20f63628b req-4a4437dc-1e64-4a54-9991-b4c5edd750d4 service nova] Lock "9e47d3ce-3897-458b-ac85-d98745e9aeb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 835.341769] env[67964]: DEBUG oslo_concurrency.lockutils [req-6b69f5ec-e8f6-49c2-9c97-49d20f63628b req-4a4437dc-1e64-4a54-9991-b4c5edd750d4 service nova] Lock "9e47d3ce-3897-458b-ac85-d98745e9aeb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 835.341972] env[67964]: DEBUG nova.compute.manager [req-6b69f5ec-e8f6-49c2-9c97-49d20f63628b req-4a4437dc-1e64-4a54-9991-b4c5edd750d4 service nova] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] No waiting events found dispatching network-vif-plugged-c200fbe5-6838-4ca1-b819-327a8bbf5719 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 835.343717] env[67964]: WARNING nova.compute.manager [req-6b69f5ec-e8f6-49c2-9c97-49d20f63628b req-4a4437dc-1e64-4a54-9991-b4c5edd750d4 service nova] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Received unexpected event network-vif-plugged-c200fbe5-6838-4ca1-b819-327a8bbf5719 for instance with vm_state building and task_state spawning. [ 835.659753] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456745, 'name': CreateVM_Task, 'duration_secs': 0.304399} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 835.660108] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 835.661012] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 835.661352] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 835.661807] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 835.662204] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0cb9938f-5d45-4b14-8d63-b908f32f168b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 835.667154] env[67964]: DEBUG oslo_vmware.api [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Waiting for the task: (returnval){ [ 835.667154] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]522ab539-f503-033b-ef43-0dc84a23ac17" [ 835.667154] env[67964]: _type = "Task" [ 835.667154] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 835.675526] env[67964]: DEBUG oslo_vmware.api [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]522ab539-f503-033b-ef43-0dc84a23ac17, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 835.800371] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 835.800845] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 836.179836] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 836.180043] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 836.180293] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 836.253697] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c2f5389e-f1da-449a-8ed0-e6bb3b31e53f tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Acquiring lock "9e47d3ce-3897-458b-ac85-d98745e9aeb5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 836.368369] env[67964]: DEBUG oslo_concurrency.lockutils [None req-201f4a21-fd81-4562-9b01-167a88262704 tempest-FloatingIPsAssociationNegativeTestJSON-782376586 tempest-FloatingIPsAssociationNegativeTestJSON-782376586-project-member] Acquiring lock "ec330488-db38-486f-8d54-17afd9f07ce3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 836.368369] env[67964]: DEBUG oslo_concurrency.lockutils [None req-201f4a21-fd81-4562-9b01-167a88262704 tempest-FloatingIPsAssociationNegativeTestJSON-782376586 tempest-FloatingIPsAssociationNegativeTestJSON-782376586-project-member] Lock "ec330488-db38-486f-8d54-17afd9f07ce3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 836.800444] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 836.800779] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 836.800911] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task 
ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 836.811245] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 836.811463] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 836.811642] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 836.811827] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 836.812939] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bb9fb29-040d-457f-b5f4-7e83b590cf4f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.821857] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6520cce6-c737-412e-94f8-da4a1751d47e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.836873] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f1f0dce-f297-49cf-badf-ed0e261d4515 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.843569] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8467d6f4-2c4d-4e8b-80b2-814d023591a0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.877569] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180910MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 836.877569] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 836.877756] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 
0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 836.953456] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 93509103-8c02-420d-bcaa-c2cf0847b1f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 836.953456] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 371aeb17-ad59-4a01-88f7-466dfee8d293 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 836.953456] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 180338df-2738-4eeb-8610-cb130d04f6d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 836.953456] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8b261c6e-741c-4d6c-9567-566af85cd68f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 836.953738] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9c586d33-c563-45c7-8c54-1638a78a669c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 836.953738] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 6580c348-f5a4-4f20-a6fb-8942202a526e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 836.953738] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fed6991c-9b59-43bb-8cda-96053adb798b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 836.953738] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 707828f6-0267-42ff-95e5-6b328382b017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 836.953866] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0768fe80-7dd3-42ec-8e22-42a6aece5bef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 836.953866] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9e47d3ce-3897-458b-ac85-d98745e9aeb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 836.965672] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance b9a04994-804b-47b8-bc9f-cf4f18f27f5b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 836.976398] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 658d81c6-dd54-4af5-b51f-6b0ce8fb9336 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 836.987242] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9dc1cdba-3991-4ee2-b92b-2800e17f07a8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 836.999964] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9f51b0ff-a0fc-4a94-9d1e-578347c2f776 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 837.011877] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8b5afa56-a56a-4990-9ba7-2c0955579a65 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 837.023704] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance dcb0d0da-a987-46b2-be64-672ee3200eab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 837.035280] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2be271b8-775a-4c51-aa27-75a6a29e270b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 837.051677] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea492fb8-2352-436c-a7d5-f20423f4d353 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 837.063552] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c648c89a-ca70-4a15-9083-0cbe9e5bee23 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 837.074647] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9793d383-9033-4f86-b7bb-6b2e43347cd6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 837.084245] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 6c329e27-945e-4996-9994-85d207c35325 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 837.096979] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 5fbee4c3-bc7c-4582-b976-b0d619a69cdb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 837.109249] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7aa99c10-9a4f-4b46-8fd6-e8a7def3c9bc tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Acquiring lock "ae1668bc-04cb-4767-847a-d2b7c3d95156" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 837.109473] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7aa99c10-9a4f-4b46-8fd6-e8a7def3c9bc tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Lock "ae1668bc-04cb-4767-847a-d2b7c3d95156" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 837.109932] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 09c05646-301a-4d74-957c-1c9c6b7ab44b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 837.119841] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f6aedef6-3d4d-4839-863b-771ac818a1c4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 837.129208] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 34481f0e-b35a-4405-be54-ac23326f1183 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 837.139217] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ae68e8fe-d3d6-4313-85d7-7e2fefa3a1ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 837.149053] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec330488-db38-486f-8d54-17afd9f07ce3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 837.149321] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 837.149472] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 837.490790] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65e75377-8101-42fb-b566-1aa9d6695961 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.498461] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aecdb1ed-8728-424d-a62c-ee3f80439b7e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.529690] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-883133fd-946e-4656-8d6d-43b4dab4575e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.537271] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebd8da3b-bdc9-457b-a27e-fd3b142aa243 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.550293] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 837.558506] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 837.575161] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 837.575346] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.698s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 837.744218] env[67964]: DEBUG nova.compute.manager 
[req-a14a3a26-8e93-4dd7-91da-00f668c9bbfd req-e703fab8-336d-4e80-af0e-a636a4aa685f service nova] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Received event network-changed-c200fbe5-6838-4ca1-b819-327a8bbf5719 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 837.744335] env[67964]: DEBUG nova.compute.manager [req-a14a3a26-8e93-4dd7-91da-00f668c9bbfd req-e703fab8-336d-4e80-af0e-a636a4aa685f service nova] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Refreshing instance network info cache due to event network-changed-c200fbe5-6838-4ca1-b819-327a8bbf5719. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 837.744542] env[67964]: DEBUG oslo_concurrency.lockutils [req-a14a3a26-8e93-4dd7-91da-00f668c9bbfd req-e703fab8-336d-4e80-af0e-a636a4aa685f service nova] Acquiring lock "refresh_cache-9e47d3ce-3897-458b-ac85-d98745e9aeb5" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 837.744741] env[67964]: DEBUG oslo_concurrency.lockutils [req-a14a3a26-8e93-4dd7-91da-00f668c9bbfd req-e703fab8-336d-4e80-af0e-a636a4aa685f service nova] Acquired lock "refresh_cache-9e47d3ce-3897-458b-ac85-d98745e9aeb5" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 837.744829] env[67964]: DEBUG nova.network.neutron [req-a14a3a26-8e93-4dd7-91da-00f668c9bbfd req-e703fab8-336d-4e80-af0e-a636a4aa685f service nova] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Refreshing network info cache for port c200fbe5-6838-4ca1-b819-327a8bbf5719 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 838.293756] env[67964]: DEBUG nova.network.neutron [req-a14a3a26-8e93-4dd7-91da-00f668c9bbfd req-e703fab8-336d-4e80-af0e-a636a4aa685f service nova] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Updated VIF entry in instance network info cache for port c200fbe5-6838-4ca1-b819-327a8bbf5719. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 838.294117] env[67964]: DEBUG nova.network.neutron [req-a14a3a26-8e93-4dd7-91da-00f668c9bbfd req-e703fab8-336d-4e80-af0e-a636a4aa685f service nova] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Updating instance_info_cache with network_info: [{"id": "c200fbe5-6838-4ca1-b819-327a8bbf5719", "address": "fa:16:3e:72:bf:94", "network": {"id": "b1620391-3676-4162-8f61-cd282b73fd16", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1689141491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f1b85d829ed64fec8f242ab1f4666cdf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "edd47158-6f4b-44a1-8e82-0411205ad299", "external-id": "nsx-vlan-transportzone-587", "segmentation_id": 587, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc200fbe5-68", "ovs_interfaceid": "c200fbe5-6838-4ca1-b819-327a8bbf5719", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 838.306572] env[67964]: DEBUG oslo_concurrency.lockutils [req-a14a3a26-8e93-4dd7-91da-00f668c9bbfd req-e703fab8-336d-4e80-af0e-a636a4aa685f service nova] Releasing lock "refresh_cache-9e47d3ce-3897-458b-ac85-d98745e9aeb5" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 839.575379] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 839.575678] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 839.798798] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 839.800265] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 839.800428] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 839.800546] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) 
_heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 839.826691] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 839.828559] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 839.828559] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 839.828559] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 839.828559] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 839.828559] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 839.828816] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 839.828816] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 839.828816] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 839.828816] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 839.828816] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 840.023275] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e4583d77-8828-4618-a6e8-afc30d5a2d0b tempest-SecurityGroupsTestJSON-63199574 tempest-SecurityGroupsTestJSON-63199574-project-member] Acquiring lock "8d9addd9-ce3d-4d41-9736-1c7ca0b9fbbe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 840.023516] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e4583d77-8828-4618-a6e8-afc30d5a2d0b tempest-SecurityGroupsTestJSON-63199574 tempest-SecurityGroupsTestJSON-63199574-project-member] Lock "8d9addd9-ce3d-4d41-9736-1c7ca0b9fbbe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 850.128918] env[67964]: DEBUG oslo_concurrency.lockutils [None req-36d3e9b9-cc8a-443f-964b-22673d5c9f2d tempest-AttachInterfacesV270Test-1237935671 tempest-AttachInterfacesV270Test-1237935671-project-member] Acquiring lock "7b96b0d5-a10c-4f7f-9113-46c85ea62dfe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 850.129234] env[67964]: DEBUG oslo_concurrency.lockutils [None req-36d3e9b9-cc8a-443f-964b-22673d5c9f2d tempest-AttachInterfacesV270Test-1237935671 tempest-AttachInterfacesV270Test-1237935671-project-member] Lock "7b96b0d5-a10c-4f7f-9113-46c85ea62dfe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 851.642947] env[67964]: DEBUG oslo_concurrency.lockutils [None req-296520c0-1166-474c-8c01-fbcea84330c5 tempest-ServerRescueNegativeTestJSON-568068692 tempest-ServerRescueNegativeTestJSON-568068692-project-member] Acquiring lock "94760699-7f13-42e2-abb2-45e3374eeccb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 851.643281] env[67964]: DEBUG oslo_concurrency.lockutils [None req-296520c0-1166-474c-8c01-fbcea84330c5 tempest-ServerRescueNegativeTestJSON-568068692 tempest-ServerRescueNegativeTestJSON-568068692-project-member] Lock "94760699-7f13-42e2-abb2-45e3374eeccb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 852.445904] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7ad16c3c-0f7e-4ccb-8ec7-4eab967472a4 tempest-ServerRescueNegativeTestJSON-568068692 tempest-ServerRescueNegativeTestJSON-568068692-project-member] Acquiring lock "e239df07-066e-4dff-8302-9945a610a43a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 852.446192] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7ad16c3c-0f7e-4ccb-8ec7-4eab967472a4 
tempest-ServerRescueNegativeTestJSON-568068692 tempest-ServerRescueNegativeTestJSON-568068692-project-member] Lock "e239df07-066e-4dff-8302-9945a610a43a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 854.630737] env[67964]: DEBUG oslo_concurrency.lockutils [None req-aa658a66-dcf6-4e5e-ad7d-2bbc7f27d7bf tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] Acquiring lock "809c38e0-bc92-4a77-b307-773b6df211c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 854.631062] env[67964]: DEBUG oslo_concurrency.lockutils [None req-aa658a66-dcf6-4e5e-ad7d-2bbc7f27d7bf tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] Lock "809c38e0-bc92-4a77-b307-773b6df211c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 854.659286] env[67964]: DEBUG oslo_concurrency.lockutils [None req-aa658a66-dcf6-4e5e-ad7d-2bbc7f27d7bf tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] Acquiring lock "18c8cc6b-a7aa-43fc-b048-1d788f4c162b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 854.659507] env[67964]: DEBUG oslo_concurrency.lockutils [None req-aa658a66-dcf6-4e5e-ad7d-2bbc7f27d7bf tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] Lock "18c8cc6b-a7aa-43fc-b048-1d788f4c162b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 860.613020] env[67964]: DEBUG oslo_concurrency.lockutils [None req-96730b7e-0330-46b1-a456-bae29458dc8d tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "d90509f8-1957-4bb3-b4ec-eba8b37705b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 860.613020] env[67964]: DEBUG oslo_concurrency.lockutils [None req-96730b7e-0330-46b1-a456-bae29458dc8d tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "d90509f8-1957-4bb3-b4ec-eba8b37705b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 867.977860] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0b38ccca-91fa-44c4-80bb-fb028048d55d tempest-ServerActionsV293TestJSON-564338386 tempest-ServerActionsV293TestJSON-564338386-project-member] Acquiring lock "dabdde79-50a8-43fd-a998-868aec05d825" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" 
{{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 867.978273] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0b38ccca-91fa-44c4-80bb-fb028048d55d tempest-ServerActionsV293TestJSON-564338386 tempest-ServerActionsV293TestJSON-564338386-project-member] Lock "dabdde79-50a8-43fd-a998-868aec05d825" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 875.220389] env[67964]: WARNING oslo_vmware.rw_handles [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 875.220389] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 875.220389] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 875.220389] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 875.220389] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 875.220389] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 875.220389] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 875.220389] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 875.220389] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 875.220389] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 875.220389] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 875.220389] env[67964]: ERROR oslo_vmware.rw_handles [ 875.221137] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/358a8fe6-a398-4736-b394-abf9e698885e/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 875.222849] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 875.223150] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Copying Virtual Disk [datastore1] vmware_temp/358a8fe6-a398-4736-b394-abf9e698885e/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/358a8fe6-a398-4736-b394-abf9e698885e/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk 
/opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 875.223475] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-64448917-f40b-44a7-ad47-0f2cbce3a37f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.233311] env[67964]: DEBUG oslo_vmware.api [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Waiting for the task: (returnval){ [ 875.233311] env[67964]: value = "task-3456756" [ 875.233311] env[67964]: _type = "Task" [ 875.233311] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 875.241980] env[67964]: DEBUG oslo_vmware.api [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Task: {'id': task-3456756, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 875.743677] env[67964]: DEBUG oslo_vmware.exceptions [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 875.743970] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 875.744543] env[67964]: ERROR nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 875.744543] env[67964]: Faults: ['InvalidArgument'] [ 875.744543] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Traceback (most recent call last): [ 875.744543] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 875.744543] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] yield resources [ 875.744543] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 875.744543] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] self.driver.spawn(context, instance, image_meta, [ 875.744543] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 875.744543] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 875.744543] 
env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 875.744543] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] self._fetch_image_if_missing(context, vi) [ 875.744543] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 875.745017] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] image_cache(vi, tmp_image_ds_loc) [ 875.745017] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 875.745017] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] vm_util.copy_virtual_disk( [ 875.745017] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 875.745017] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] session._wait_for_task(vmdk_copy_task) [ 875.745017] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 875.745017] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] return self.wait_for_task(task_ref) [ 875.745017] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 875.745017] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] return evt.wait() [ 875.745017] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 875.745017] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] result = hub.switch() [ 875.745017] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 875.745017] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] return self.greenlet.switch() [ 875.745458] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 875.745458] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] self.f(*self.args, **self.kw) [ 875.745458] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 875.745458] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] raise exceptions.translate_fault(task_info.error) [ 875.745458] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 875.745458] env[67964]: ERROR nova.compute.manager [instance: 
93509103-8c02-420d-bcaa-c2cf0847b1f0] Faults: ['InvalidArgument'] [ 875.745458] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] [ 875.745458] env[67964]: INFO nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Terminating instance [ 875.746373] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 875.746564] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 875.746796] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1843a891-e416-4e08-8b49-92faff54bd27 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.749065] env[67964]: DEBUG nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 875.749259] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 875.749951] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ece201e-9144-42b3-a789-5e1f349ad7c0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.756588] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 875.756829] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6bc8853b-b270-485e-9d16-be1b62b5550a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.758944] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 875.759126] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 875.760075] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c406a5f9-234d-477d-b839-b20847fcf0f7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.764774] env[67964]: DEBUG oslo_vmware.api [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Waiting for the task: (returnval){ [ 875.764774] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52dd2d5c-5869-1eaa-2560-9a5f6c84902e" [ 875.764774] env[67964]: _type = "Task" [ 875.764774] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 875.771559] env[67964]: DEBUG oslo_vmware.api [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52dd2d5c-5869-1eaa-2560-9a5f6c84902e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 875.827666] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 875.827883] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 875.828081] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Deleting the datastore file [datastore1] 93509103-8c02-420d-bcaa-c2cf0847b1f0 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 875.828329] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d3723656-60ab-4999-ae6c-0987dae56270 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.834737] env[67964]: DEBUG oslo_vmware.api [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Waiting for the task: (returnval){ [ 875.834737] env[67964]: value = "task-3456758" [ 875.834737] env[67964]: _type = "Task" [ 875.834737] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 875.842176] env[67964]: DEBUG oslo_vmware.api [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Task: {'id': task-3456758, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 876.274895] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 876.275220] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Creating directory with path [datastore1] vmware_temp/f26a6c9b-0cbc-43c8-8ec7-c1a896a0084d/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 876.275408] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f0219f87-c539-41f2-9f4e-4bcc0f14b129 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.287574] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Created directory with path [datastore1] vmware_temp/f26a6c9b-0cbc-43c8-8ec7-c1a896a0084d/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 876.287745] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Fetch image to [datastore1] vmware_temp/f26a6c9b-0cbc-43c8-8ec7-c1a896a0084d/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 876.287865] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/f26a6c9b-0cbc-43c8-8ec7-c1a896a0084d/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 876.288637] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd6ac786-75cf-4b76-b904-ee152b2baaff {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.295520] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-174389f7-6d05-4b23-9aa7-c422421fbcde {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.305023] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a6aa1c7-4c10-4edc-b721-5c2380ac858c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.340029] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-88bd363c-6ab9-4f31-bc03-0379a69719ab {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.347126] env[67964]: DEBUG oslo_vmware.api [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Task: {'id': task-3456758, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064326} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 876.348574] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 876.348813] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 876.348958] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 876.349143] env[67964]: INFO nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Took 0.60 seconds to destroy the instance on the hypervisor. 
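The CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above all follow the same shape: the driver invokes a vCenter method, gets back a task handle (e.g. "task-3456758"), and oslo_vmware.api then polls it ("progress is 0%" ... "completed successfully ... duration_secs") until it reaches a terminal state, raising a translated fault on error. Below is a minimal stdlib-only sketch of that polling pattern; the names TaskInfo and fetch_task_info are illustrative stand-ins, not oslo.vmware's real API.

```python
# Sketch of the task-polling loop visible in the log above (assumptions:
# TaskInfo / fetch_task_info are invented here for illustration).
import time
from dataclasses import dataclass

POLL_INTERVAL = 0.5  # seconds between polls; oslo.vmware drives this with a looping call


@dataclass
class TaskInfo:
    state: str            # 'running', 'success', or 'error'
    progress: int         # percent complete, as in "progress is 0%"
    error: str | None = None


def wait_for_task(fetch_task_info, timeout=300.0):
    """Poll a task until it succeeds, fails, or times out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = fetch_task_info()
        if info.state == 'success':
            return info
        if info.state == 'error':
            # oslo.vmware translates the server fault at this point; that is
            # where the VimFaultException seen earlier in this log originates.
            raise RuntimeError(info.error)
        print(f"task progress is {info.progress}%")
        time.sleep(POLL_INTERVAL)
    raise TimeoutError("task did not complete in time")


# Demo: a fake task that completes on the third poll.
_states = iter([TaskInfo('running', 0), TaskInfo('running', 50), TaskInfo('success', 100)])
wait_for_task(lambda: next(_states))
```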
[ 876.351183] env[67964]: DEBUG nova.compute.claims [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 876.351363] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 876.351576] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 876.354312] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6740aedc-f0a0-4461-a8e4-00fb900c397e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.382636] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 876.440059] env[67964]: DEBUG oslo_vmware.rw_handles [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f26a6c9b-0cbc-43c8-8ec7-c1a896a0084d/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 876.498200] env[67964]: DEBUG nova.scheduler.client.report [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Refreshing inventories for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:818}} [ 876.502567] env[67964]: DEBUG oslo_vmware.rw_handles [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Completed reading data from the image iterator. 
{{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 876.502750] env[67964]: DEBUG oslo_vmware.rw_handles [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f26a6c9b-0cbc-43c8-8ec7-c1a896a0084d/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 876.515351] env[67964]: DEBUG nova.scheduler.client.report [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Updating ProviderTree inventory for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:782}} [ 876.515603] env[67964]: DEBUG nova.compute.provider_tree [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Updating inventory in ProviderTree for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 876.528525] env[67964]: DEBUG nova.scheduler.client.report [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Refreshing aggregate associations for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41, aggregates: None {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:827}} [ 876.546890] env[67964]: DEBUG nova.scheduler.client.report [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Refreshing trait associations for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:839}} [ 876.878852] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ea24138-ec7f-4ef6-b345-3e9ee86c4bf3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.886717] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3c9325b7-382b-499e-9d4d-7dc3eb9555e1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.915654] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a00d3b5-2354-4fb1-8122-c6af1f19c510 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.924321] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff83a9fc-927e-42aa-be20-40bd602cdae0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.938373] env[67964]: DEBUG nova.compute.provider_tree [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 876.949062] env[67964]: DEBUG nova.scheduler.client.report [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 876.962528] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.611s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 876.963855] env[67964]: ERROR nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 876.963855] env[67964]: Faults: ['InvalidArgument'] [ 876.963855] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Traceback (most recent call last): [ 876.963855] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 876.963855] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] self.driver.spawn(context, instance, image_meta, [ 876.963855] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 876.963855] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] self._vmops.spawn(context, instance, image_meta, 
injected_files, [ 876.963855] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 876.963855] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] self._fetch_image_if_missing(context, vi) [ 876.963855] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 876.963855] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] image_cache(vi, tmp_image_ds_loc) [ 876.963855] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 876.964265] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] vm_util.copy_virtual_disk( [ 876.964265] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 876.964265] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] session._wait_for_task(vmdk_copy_task) [ 876.964265] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 876.964265] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] return self.wait_for_task(task_ref) [ 876.964265] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 876.964265] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] return evt.wait() [ 876.964265] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 876.964265] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] result = hub.switch() [ 876.964265] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 876.964265] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] return self.greenlet.switch() [ 876.964265] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 876.964265] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] self.f(*self.args, **self.kw) [ 876.964603] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 876.964603] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] raise exceptions.translate_fault(task_info.error) [ 876.964603] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 876.964603] env[67964]: ERROR 
nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Faults: ['InvalidArgument'] [ 876.964603] env[67964]: ERROR nova.compute.manager [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] [ 876.964603] env[67964]: DEBUG nova.compute.utils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 876.965335] env[67964]: DEBUG nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Build of instance 93509103-8c02-420d-bcaa-c2cf0847b1f0 was re-scheduled: A specified parameter was not correct: fileType [ 876.965335] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 876.965744] env[67964]: DEBUG nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 876.965912] env[67964]: DEBUG nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 876.966077] env[67964]: DEBUG nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 876.966235] env[67964]: DEBUG nova.network.neutron [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 877.282764] env[67964]: DEBUG nova.network.neutron [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 877.294667] env[67964]: INFO nova.compute.manager [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Took 0.33 seconds to deallocate network for instance. 
[ 877.402244] env[67964]: INFO nova.scheduler.client.report [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Deleted allocations for instance 93509103-8c02-420d-bcaa-c2cf0847b1f0 [ 877.423832] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8766be6d-82c4-4515-bb57-1bc5eeb01125 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Lock "93509103-8c02-420d-bcaa-c2cf0847b1f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 294.997s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 877.425581] env[67964]: DEBUG oslo_concurrency.lockutils [None req-38346a1b-6b35-47f0-a34b-6a70f4e1dc54 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Lock "93509103-8c02-420d-bcaa-c2cf0847b1f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 96.023s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 877.425855] env[67964]: DEBUG oslo_concurrency.lockutils [None req-38346a1b-6b35-47f0-a34b-6a70f4e1dc54 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Acquiring lock "93509103-8c02-420d-bcaa-c2cf0847b1f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 877.426149] env[67964]: DEBUG oslo_concurrency.lockutils [None req-38346a1b-6b35-47f0-a34b-6a70f4e1dc54 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Lock "93509103-8c02-420d-bcaa-c2cf0847b1f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 877.426384] env[67964]: DEBUG oslo_concurrency.lockutils [None req-38346a1b-6b35-47f0-a34b-6a70f4e1dc54 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Lock "93509103-8c02-420d-bcaa-c2cf0847b1f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 877.428555] env[67964]: INFO nova.compute.manager [None req-38346a1b-6b35-47f0-a34b-6a70f4e1dc54 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Terminating instance [ 877.430455] env[67964]: DEBUG nova.compute.manager [None req-38346a1b-6b35-47f0-a34b-6a70f4e1dc54 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 877.430640] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-38346a1b-6b35-47f0-a34b-6a70f4e1dc54 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 877.431469] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f9fada88-5a68-4028-8912-096b3daf8296 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.435808] env[67964]: DEBUG nova.compute.manager [None req-aa031614-c845-4612-952a-5ada591a2966 tempest-AttachInterfacesUnderV243Test-544305701 tempest-AttachInterfacesUnderV243Test-544305701-project-member] [instance: b9a04994-804b-47b8-bc9f-cf4f18f27f5b] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 877.445019] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2507d53-b06c-42c5-8025-95de631bf052 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.474384] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-38346a1b-6b35-47f0-a34b-6a70f4e1dc54 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 93509103-8c02-420d-bcaa-c2cf0847b1f0 could not be found. [ 877.474677] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-38346a1b-6b35-47f0-a34b-6a70f4e1dc54 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 877.474951] env[67964]: INFO nova.compute.manager [None req-38346a1b-6b35-47f0-a34b-6a70f4e1dc54 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Took 0.04 seconds to destroy the instance on the hypervisor. [ 877.475305] env[67964]: DEBUG oslo.service.loopingcall [None req-38346a1b-6b35-47f0-a34b-6a70f4e1dc54 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 877.475766] env[67964]: DEBUG nova.compute.manager [None req-aa031614-c845-4612-952a-5ada591a2966 tempest-AttachInterfacesUnderV243Test-544305701 tempest-AttachInterfacesUnderV243Test-544305701-project-member] [instance: b9a04994-804b-47b8-bc9f-cf4f18f27f5b] Instance disappeared before build. 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 877.477424] env[67964]: DEBUG nova.compute.manager [-] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 877.477587] env[67964]: DEBUG nova.network.neutron [-] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 877.509017] env[67964]: DEBUG oslo_concurrency.lockutils [None req-aa031614-c845-4612-952a-5ada591a2966 tempest-AttachInterfacesUnderV243Test-544305701 tempest-AttachInterfacesUnderV243Test-544305701-project-member] Lock "b9a04994-804b-47b8-bc9f-cf4f18f27f5b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 238.046s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 877.509738] env[67964]: DEBUG nova.network.neutron [-] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 877.516816] env[67964]: DEBUG nova.compute.manager [None req-6858452e-abec-4bc0-8cfa-c37aa7286f1e tempest-ServerActionsTestOtherB-968802063 tempest-ServerActionsTestOtherB-968802063-project-member] [instance: 658d81c6-dd54-4af5-b51f-6b0ce8fb9336] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 877.520014] env[67964]: INFO nova.compute.manager [-] [instance: 93509103-8c02-420d-bcaa-c2cf0847b1f0] Took 0.04 seconds to deallocate network for instance. [ 877.541962] env[67964]: DEBUG nova.compute.manager [None req-6858452e-abec-4bc0-8cfa-c37aa7286f1e tempest-ServerActionsTestOtherB-968802063 tempest-ServerActionsTestOtherB-968802063-project-member] [instance: 658d81c6-dd54-4af5-b51f-6b0ce8fb9336] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 877.565042] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6858452e-abec-4bc0-8cfa-c37aa7286f1e tempest-ServerActionsTestOtherB-968802063 tempest-ServerActionsTestOtherB-968802063-project-member] Lock "658d81c6-dd54-4af5-b51f-6b0ce8fb9336" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 237.018s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 877.573969] env[67964]: DEBUG nova.compute.manager [None req-301e0b5c-07bc-43e1-892c-706f071bae3d tempest-ServersV294TestFqdnHostnames-1254988542 tempest-ServersV294TestFqdnHostnames-1254988542-project-member] [instance: 9dc1cdba-3991-4ee2-b92b-2800e17f07a8] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 877.610281] env[67964]: DEBUG nova.compute.manager [None req-301e0b5c-07bc-43e1-892c-706f071bae3d tempest-ServersV294TestFqdnHostnames-1254988542 tempest-ServersV294TestFqdnHostnames-1254988542-project-member] [instance: 9dc1cdba-3991-4ee2-b92b-2800e17f07a8] Instance disappeared before build. 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 877.624991] env[67964]: DEBUG oslo_concurrency.lockutils [None req-38346a1b-6b35-47f0-a34b-6a70f4e1dc54 tempest-ServersAdminNegativeTestJSON-123213895 tempest-ServersAdminNegativeTestJSON-123213895-project-member] Lock "93509103-8c02-420d-bcaa-c2cf0847b1f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.199s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 877.634796] env[67964]: DEBUG oslo_concurrency.lockutils [None req-301e0b5c-07bc-43e1-892c-706f071bae3d tempest-ServersV294TestFqdnHostnames-1254988542 tempest-ServersV294TestFqdnHostnames-1254988542-project-member] Lock "9dc1cdba-3991-4ee2-b92b-2800e17f07a8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 232.658s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 877.643747] env[67964]: DEBUG nova.compute.manager [None req-d5607a7b-08df-4fd9-88e7-d5361f97562d tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] [instance: 9f51b0ff-a0fc-4a94-9d1e-578347c2f776] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 877.667580] env[67964]: DEBUG nova.compute.manager [None req-d5607a7b-08df-4fd9-88e7-d5361f97562d tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] [instance: 9f51b0ff-a0fc-4a94-9d1e-578347c2f776] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 877.689045] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d5607a7b-08df-4fd9-88e7-d5361f97562d tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] Lock "9f51b0ff-a0fc-4a94-9d1e-578347c2f776" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 223.579s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 877.698432] env[67964]: DEBUG nova.compute.manager [None req-d5607a7b-08df-4fd9-88e7-d5361f97562d tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] [instance: 8b5afa56-a56a-4990-9ba7-2c0955579a65] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 877.724695] env[67964]: DEBUG nova.compute.manager [None req-d5607a7b-08df-4fd9-88e7-d5361f97562d tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] [instance: 8b5afa56-a56a-4990-9ba7-2c0955579a65] Instance disappeared before build. 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 877.747353] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d5607a7b-08df-4fd9-88e7-d5361f97562d tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] Lock "8b5afa56-a56a-4990-9ba7-2c0955579a65" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 223.610s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 877.757218] env[67964]: DEBUG nova.compute.manager [None req-8110f381-0d67-4dd9-9635-e69c1173f6e6 tempest-ServerActionsTestOtherA-1206453540 tempest-ServerActionsTestOtherA-1206453540-project-member] [instance: dcb0d0da-a987-46b2-be64-672ee3200eab] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 877.784781] env[67964]: DEBUG nova.compute.manager [None req-8110f381-0d67-4dd9-9635-e69c1173f6e6 tempest-ServerActionsTestOtherA-1206453540 tempest-ServerActionsTestOtherA-1206453540-project-member] [instance: dcb0d0da-a987-46b2-be64-672ee3200eab] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 877.811346] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8110f381-0d67-4dd9-9635-e69c1173f6e6 tempest-ServerActionsTestOtherA-1206453540 tempest-ServerActionsTestOtherA-1206453540-project-member] Lock "dcb0d0da-a987-46b2-be64-672ee3200eab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 223.531s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 877.820491] env[67964]: DEBUG nova.compute.manager [None req-5c869298-aaf3-4224-8d6d-dbd99e7c1448 tempest-InstanceActionsV221TestJSON-959559326 tempest-InstanceActionsV221TestJSON-959559326-project-member] [instance: 2be271b8-775a-4c51-aa27-75a6a29e270b] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 877.844451] env[67964]: DEBUG nova.compute.manager [None req-5c869298-aaf3-4224-8d6d-dbd99e7c1448 tempest-InstanceActionsV221TestJSON-959559326 tempest-InstanceActionsV221TestJSON-959559326-project-member] [instance: 2be271b8-775a-4c51-aa27-75a6a29e270b] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 877.865522] env[67964]: DEBUG oslo_concurrency.lockutils [None req-5c869298-aaf3-4224-8d6d-dbd99e7c1448 tempest-InstanceActionsV221TestJSON-959559326 tempest-InstanceActionsV221TestJSON-959559326-project-member] Lock "2be271b8-775a-4c51-aa27-75a6a29e270b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 222.173s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 877.874489] env[67964]: DEBUG nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Starting instance... 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 877.934069] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 877.934331] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 877.937033] env[67964]: INFO nova.compute.claims [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 878.315071] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6419e1c-e68e-4f1c-9542-38e271a4d5c7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.322801] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4a62e26-adc2-4547-b755-361ec68712e1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.352866] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23a9e209-3b97-4c84-80af-2e77184cb520 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.359764] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fec24a3-0bc1-4cd1-b77a-b4f1e87e0fe1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.372907] env[67964]: DEBUG nova.compute.provider_tree [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 878.383616] env[67964]: DEBUG nova.scheduler.client.report [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 878.397184] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d 
tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.463s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.397655] env[67964]: DEBUG nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 878.429491] env[67964]: DEBUG nova.compute.utils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 878.431030] env[67964]: DEBUG nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 878.431215] env[67964]: DEBUG nova.network.neutron [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 878.442338] env[67964]: DEBUG nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 878.511035] env[67964]: DEBUG nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 878.540025] env[67964]: DEBUG nova.virt.hardware [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 878.540025] env[67964]: DEBUG nova.virt.hardware [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 878.540025] env[67964]: DEBUG nova.virt.hardware [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 878.540249] env[67964]: DEBUG nova.virt.hardware [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 878.544021] env[67964]: DEBUG nova.virt.hardware [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 878.544021] env[67964]: DEBUG nova.virt.hardware [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 878.544021] env[67964]: DEBUG nova.virt.hardware [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 878.544021] env[67964]: DEBUG nova.virt.hardware [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 878.544021] env[67964]: DEBUG nova.virt.hardware [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d 
tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 878.544354] env[67964]: DEBUG nova.virt.hardware [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 878.544354] env[67964]: DEBUG nova.virt.hardware [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 878.544354] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-838bea24-7259-4976-b423-afc7cda58dc0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.546999] env[67964]: DEBUG nova.policy [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '63d0242278a644cfb1d5a71a91fd1a9e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2f6de6f78fae4eaca255f09d977ff229', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 878.555886] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fedd0993-1ae3-426a-a3fd-7ce3d4e7a882 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.963866] env[67964]: DEBUG nova.network.neutron [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Successfully created port: 54e36d76-dc22-4d84-a23b-b83427f4fcea {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 880.368978] env[67964]: DEBUG nova.network.neutron [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Successfully updated port: 54e36d76-dc22-4d84-a23b-b83427f4fcea {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 880.386686] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Acquiring lock "refresh_cache-ea492fb8-2352-436c-a7d5-f20423f4d353" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 880.386903] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Acquired lock 
"refresh_cache-ea492fb8-2352-436c-a7d5-f20423f4d353" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 880.386989] env[67964]: DEBUG nova.network.neutron [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 880.463150] env[67964]: DEBUG nova.compute.manager [req-ec94f035-4b82-42e2-bb81-151463af7637 req-9d8c1d01-aa81-4016-a492-4b919ef1db69 service nova] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Received event network-vif-plugged-54e36d76-dc22-4d84-a23b-b83427f4fcea {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 880.463150] env[67964]: DEBUG oslo_concurrency.lockutils [req-ec94f035-4b82-42e2-bb81-151463af7637 req-9d8c1d01-aa81-4016-a492-4b919ef1db69 service nova] Acquiring lock "ea492fb8-2352-436c-a7d5-f20423f4d353-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 880.463150] env[67964]: DEBUG oslo_concurrency.lockutils [req-ec94f035-4b82-42e2-bb81-151463af7637 req-9d8c1d01-aa81-4016-a492-4b919ef1db69 service nova] Lock "ea492fb8-2352-436c-a7d5-f20423f4d353-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 880.463150] env[67964]: DEBUG oslo_concurrency.lockutils [req-ec94f035-4b82-42e2-bb81-151463af7637 req-9d8c1d01-aa81-4016-a492-4b919ef1db69 service nova] Lock "ea492fb8-2352-436c-a7d5-f20423f4d353-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 880.463311] env[67964]: DEBUG nova.compute.manager [req-ec94f035-4b82-42e2-bb81-151463af7637 req-9d8c1d01-aa81-4016-a492-4b919ef1db69 service nova] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] No waiting events found dispatching network-vif-plugged-54e36d76-dc22-4d84-a23b-b83427f4fcea {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 880.463311] env[67964]: WARNING nova.compute.manager [req-ec94f035-4b82-42e2-bb81-151463af7637 req-9d8c1d01-aa81-4016-a492-4b919ef1db69 service nova] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Received unexpected event network-vif-plugged-54e36d76-dc22-4d84-a23b-b83427f4fcea for instance with vm_state building and task_state spawning. [ 880.580406] env[67964]: DEBUG nova.network.neutron [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Instance cache missing network info. 
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 881.150185] env[67964]: DEBUG nova.network.neutron [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Updating instance_info_cache with network_info: [{"id": "54e36d76-dc22-4d84-a23b-b83427f4fcea", "address": "fa:16:3e:4b:9b:2b", "network": {"id": "9f52956a-9dea-4dfe-986e-4c1aabbd4010", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1975544848-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "2f6de6f78fae4eaca255f09d977ff229", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b5a629f-6902-4d30-9278-74b443a8371d", "external-id": "nsx-vlan-transportzone-185", "segmentation_id": 185, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap54e36d76-dc", "ovs_interfaceid": "54e36d76-dc22-4d84-a23b-b83427f4fcea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 881.164018] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Releasing lock "refresh_cache-ea492fb8-2352-436c-a7d5-f20423f4d353" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 881.164018] env[67964]: DEBUG nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Instance network_info: |[{"id": "54e36d76-dc22-4d84-a23b-b83427f4fcea", "address": "fa:16:3e:4b:9b:2b", "network": {"id": "9f52956a-9dea-4dfe-986e-4c1aabbd4010", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1975544848-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "2f6de6f78fae4eaca255f09d977ff229", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b5a629f-6902-4d30-9278-74b443a8371d", "external-id": "nsx-vlan-transportzone-185", "segmentation_id": 185, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap54e36d76-dc", "ovs_interfaceid": "54e36d76-dc22-4d84-a23b-b83427f4fcea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 881.164224] env[67964]: DEBUG nova.virt.vmwareapi.vmops 
[None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4b:9b:2b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6b5a629f-6902-4d30-9278-74b443a8371d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '54e36d76-dc22-4d84-a23b-b83427f4fcea', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 881.170784] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Creating folder: Project (2f6de6f78fae4eaca255f09d977ff229). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 881.171543] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7da7e1fe-4cfb-4654-bd2f-15247339b7aa {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 881.182680] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Created folder: Project (2f6de6f78fae4eaca255f09d977ff229) in parent group-v690366. [ 881.183069] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Creating folder: Instances. Parent ref: group-v690420. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 881.183410] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1eb6a070-8835-49f1-a269-134fa1ef3f64 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 881.191747] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Created folder: Instances in parent group-v690420. [ 881.192116] env[67964]: DEBUG oslo.service.loopingcall [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 881.192401] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 881.192690] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-151c162c-ee34-438f-9914-1b43a5803658 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 881.214020] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 881.214020] env[67964]: value = "task-3456761" [ 881.214020] env[67964]: _type = "Task" [ 881.214020] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 881.220610] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456761, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 881.722968] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456761, 'name': CreateVM_Task, 'duration_secs': 0.36329} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 881.723617] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 881.724416] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 881.724638] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 881.725007] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 881.725354] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f15653e3-088a-4d7a-b2cb-c395b5208ee7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 881.730630] env[67964]: DEBUG oslo_vmware.api [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Waiting for the task: (returnval){ [ 881.730630] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5295387b-3fd6-81cd-fd94-19399cb7228d" [ 881.730630] env[67964]: _type = "Task" [ 881.730630] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 881.739589] env[67964]: DEBUG oslo_vmware.api [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5295387b-3fd6-81cd-fd94-19399cb7228d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 882.245448] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 882.245725] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 882.245947] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 882.631253] env[67964]: DEBUG nova.compute.manager [req-dade725a-aa5a-420a-9850-573f6961ea26 req-6dcb966e-ab2e-4ccf-82a2-a60b475442ad service nova] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Received event network-changed-54e36d76-dc22-4d84-a23b-b83427f4fcea {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 882.631471] env[67964]: DEBUG nova.compute.manager [req-dade725a-aa5a-420a-9850-573f6961ea26 req-6dcb966e-ab2e-4ccf-82a2-a60b475442ad service nova] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Refreshing instance network info cache due to event network-changed-54e36d76-dc22-4d84-a23b-b83427f4fcea. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 882.631610] env[67964]: DEBUG oslo_concurrency.lockutils [req-dade725a-aa5a-420a-9850-573f6961ea26 req-6dcb966e-ab2e-4ccf-82a2-a60b475442ad service nova] Acquiring lock "refresh_cache-ea492fb8-2352-436c-a7d5-f20423f4d353" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 882.631749] env[67964]: DEBUG oslo_concurrency.lockutils [req-dade725a-aa5a-420a-9850-573f6961ea26 req-6dcb966e-ab2e-4ccf-82a2-a60b475442ad service nova] Acquired lock "refresh_cache-ea492fb8-2352-436c-a7d5-f20423f4d353" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 882.631949] env[67964]: DEBUG nova.network.neutron [req-dade725a-aa5a-420a-9850-573f6961ea26 req-6dcb966e-ab2e-4ccf-82a2-a60b475442ad service nova] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Refreshing network info cache for port 54e36d76-dc22-4d84-a23b-b83427f4fcea {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 883.037515] env[67964]: DEBUG nova.network.neutron [req-dade725a-aa5a-420a-9850-573f6961ea26 req-6dcb966e-ab2e-4ccf-82a2-a60b475442ad service nova] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Updated VIF entry in instance network info cache for port 54e36d76-dc22-4d84-a23b-b83427f4fcea. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 883.037866] env[67964]: DEBUG nova.network.neutron [req-dade725a-aa5a-420a-9850-573f6961ea26 req-6dcb966e-ab2e-4ccf-82a2-a60b475442ad service nova] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Updating instance_info_cache with network_info: [{"id": "54e36d76-dc22-4d84-a23b-b83427f4fcea", "address": "fa:16:3e:4b:9b:2b", "network": {"id": "9f52956a-9dea-4dfe-986e-4c1aabbd4010", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1975544848-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "2f6de6f78fae4eaca255f09d977ff229", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b5a629f-6902-4d30-9278-74b443a8371d", "external-id": "nsx-vlan-transportzone-185", "segmentation_id": 185, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap54e36d76-dc", "ovs_interfaceid": "54e36d76-dc22-4d84-a23b-b83427f4fcea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 883.047722] env[67964]: DEBUG oslo_concurrency.lockutils [req-dade725a-aa5a-420a-9850-573f6961ea26 req-6dcb966e-ab2e-4ccf-82a2-a60b475442ad service nova] Releasing lock "refresh_cache-ea492fb8-2352-436c-a7d5-f20423f4d353" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 894.800800] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 895.046923] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Acquiring lock "67eb58c3-a895-4427-9197-3b0c731a123a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 895.047196] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Lock "67eb58c3-a895-4427-9197-3b0c731a123a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 896.800744] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 896.801084] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None 
None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 896.801337] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 896.801525] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 896.815803] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 896.816041] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 896.816228] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 896.816383] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 896.817529] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b32ddcdd-f16f-4d99-83de-fd81866d133e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 896.827973] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62ca8ed0-f27b-47df-bc5c-b7bc55f0c551 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 896.841786] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af019c4f-cda6-40f7-a42e-9e464d97853b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 896.848185] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1d47fae-8c4a-4d76-ba1b-3da19d5beffd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 896.878587] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180909MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 
896.878587] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 896.878587] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 896.952542] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 371aeb17-ad59-4a01-88f7-466dfee8d293 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 896.952542] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 180338df-2738-4eeb-8610-cb130d04f6d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 896.952542] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8b261c6e-741c-4d6c-9567-566af85cd68f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 896.952672] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9c586d33-c563-45c7-8c54-1638a78a669c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 896.952751] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 6580c348-f5a4-4f20-a6fb-8942202a526e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 896.953048] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fed6991c-9b59-43bb-8cda-96053adb798b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 896.953181] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 707828f6-0267-42ff-95e5-6b328382b017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 896.953306] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0768fe80-7dd3-42ec-8e22-42a6aece5bef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 896.953425] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9e47d3ce-3897-458b-ac85-d98745e9aeb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 896.953543] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea492fb8-2352-436c-a7d5-f20423f4d353 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 896.964925] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c648c89a-ca70-4a15-9083-0cbe9e5bee23 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 896.976875] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9793d383-9033-4f86-b7bb-6b2e43347cd6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 896.986193] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 6c329e27-945e-4996-9994-85d207c35325 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 896.997917] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 5fbee4c3-bc7c-4582-b976-b0d619a69cdb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.008608] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 09c05646-301a-4d74-957c-1c9c6b7ab44b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.020247] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f6aedef6-3d4d-4839-863b-771ac818a1c4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.030687] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 34481f0e-b35a-4405-be54-ac23326f1183 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.041113] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ae68e8fe-d3d6-4313-85d7-7e2fefa3a1ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.050943] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec330488-db38-486f-8d54-17afd9f07ce3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.061647] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ae1668bc-04cb-4767-847a-d2b7c3d95156 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.072291] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8d9addd9-ce3d-4d41-9736-1c7ca0b9fbbe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.083386] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7b96b0d5-a10c-4f7f-9113-46c85ea62dfe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.094786] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 94760699-7f13-42e2-abb2-45e3374eeccb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.105373] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance e239df07-066e-4dff-8302-9945a610a43a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.115764] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 809c38e0-bc92-4a77-b307-773b6df211c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.128246] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18c8cc6b-a7aa-43fc-b048-1d788f4c162b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.137985] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d90509f8-1957-4bb3-b4ec-eba8b37705b6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.147654] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance dabdde79-50a8-43fd-a998-868aec05d825 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.157401] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 67eb58c3-a895-4427-9197-3b0c731a123a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 897.157690] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 897.157885] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 897.481589] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bccd7537-12f1-4c76-9610-df8db12a7821 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.489343] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a02ac09d-ddaa-4464-9e65-ea9c236df682 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.519689] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c85d125-d414-49fa-897a-87a857a21459 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.527014] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a375b4d5-f554-418e-aad0-1ca7b2bb8cfa {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.539996] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 897.548332] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 897.561603] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 897.561780] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.683s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 898.561773] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 898.800166] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 898.800304] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 899.800834] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 899.801146] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 899.801211] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 899.820280] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 899.820439] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 899.820571] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 899.820698] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 899.820863] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 899.821036] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 899.821290] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 899.821422] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 899.821551] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 899.821663] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 899.821782] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 900.817337] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 900.817636] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 925.293932] env[67964]: WARNING oslo_vmware.rw_handles [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 925.293932] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 925.293932] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 925.293932] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 925.293932] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 925.293932] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 925.293932] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 925.293932] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 925.293932] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 925.293932] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 925.293932] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 925.293932] env[67964]: ERROR oslo_vmware.rw_handles [ 925.294778] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/f26a6c9b-0cbc-43c8-8ec7-c1a896a0084d/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 925.296303] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 925.296561] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Copying Virtual Disk [datastore1] vmware_temp/f26a6c9b-0cbc-43c8-8ec7-c1a896a0084d/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] 
vmware_temp/f26a6c9b-0cbc-43c8-8ec7-c1a896a0084d/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 925.296848] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-40bba022-5160-48ed-8fc8-b4d1ea85a300 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 925.305131] env[67964]: DEBUG oslo_vmware.api [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Waiting for the task: (returnval){ [ 925.305131] env[67964]: value = "task-3456762" [ 925.305131] env[67964]: _type = "Task" [ 925.305131] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 925.312796] env[67964]: DEBUG oslo_vmware.api [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Task: {'id': task-3456762, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 925.815544] env[67964]: DEBUG oslo_vmware.exceptions [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 925.815866] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 925.816440] env[67964]: ERROR nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 925.816440] env[67964]: Faults: ['InvalidArgument'] [ 925.816440] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Traceback (most recent call last): [ 925.816440] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 925.816440] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] yield resources [ 925.816440] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 925.816440] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] self.driver.spawn(context, instance, image_meta, [ 925.816440] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 
925.816440] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] self._vmops.spawn(context, instance, image_meta, injected_files, [ 925.816440] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 925.816440] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] self._fetch_image_if_missing(context, vi) [ 925.816440] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 925.816852] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] image_cache(vi, tmp_image_ds_loc) [ 925.816852] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 925.816852] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] vm_util.copy_virtual_disk( [ 925.816852] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 925.816852] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] session._wait_for_task(vmdk_copy_task) [ 925.816852] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 925.816852] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] return self.wait_for_task(task_ref) [ 925.816852] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 925.816852] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] return evt.wait() [ 925.816852] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 925.816852] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] result = hub.switch() [ 925.816852] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 925.816852] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] return self.greenlet.switch() [ 925.817286] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 925.817286] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] self.f(*self.args, **self.kw) [ 925.817286] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 925.817286] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] raise exceptions.translate_fault(task_info.error) [ 925.817286] env[67964]: ERROR nova.compute.manager [instance: 
371aeb17-ad59-4a01-88f7-466dfee8d293] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 925.817286] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Faults: ['InvalidArgument'] [ 925.817286] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] [ 925.817286] env[67964]: INFO nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Terminating instance [ 925.819484] env[67964]: DEBUG nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 925.819673] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 925.819984] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 925.820196] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 925.820997] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bd71030-3448-4092-942f-2ba3d39f2db6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 925.823692] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-54b91d75-7df7-4eb0-94bf-cb19d5735318 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 925.829966] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 925.830202] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-72cefe25-228c-4e36-9705-6976841ce719 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 925.833026] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 
tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 925.833026] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 925.833507] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6f87d1b1-6440-4d4f-9ba0-777bf40c98d6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 925.838195] env[67964]: DEBUG oslo_vmware.api [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Waiting for the task: (returnval){ [ 925.838195] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52caee86-7126-f419-3124-a7be80bd8d12" [ 925.838195] env[67964]: _type = "Task" [ 925.838195] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 925.845353] env[67964]: DEBUG oslo_vmware.api [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52caee86-7126-f419-3124-a7be80bd8d12, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 925.908636] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 925.908858] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 925.909050] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Deleting the datastore file [datastore1] 371aeb17-ad59-4a01-88f7-466dfee8d293 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 925.909380] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1332d651-cdca-4105-b520-1afa3d17f186 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 925.915411] env[67964]: DEBUG oslo_vmware.api [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Waiting for the task: (returnval){ [ 925.915411] env[67964]: value = 
"task-3456764" [ 925.915411] env[67964]: _type = "Task" [ 925.915411] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 925.922824] env[67964]: DEBUG oslo_vmware.api [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Task: {'id': task-3456764, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 926.349063] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 926.349460] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Creating directory with path [datastore1] vmware_temp/ed0adb0d-dab6-4ffd-b4ec-97cb1ef94adc/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 926.349601] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-77418993-fe06-427c-b54a-efb33d418919 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.362284] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Created directory with path [datastore1] vmware_temp/ed0adb0d-dab6-4ffd-b4ec-97cb1ef94adc/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 926.362500] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Fetch image to [datastore1] vmware_temp/ed0adb0d-dab6-4ffd-b4ec-97cb1ef94adc/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 926.362635] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/ed0adb0d-dab6-4ffd-b4ec-97cb1ef94adc/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 926.363414] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ecacc1b-93d7-43b1-8363-92f99f8b3edc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.370328] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-00264f36-b784-4c3e-a9e5-d66a6c7ebce7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.379711] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b425315-afe8-46b0-8957-7ccb88099080 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.413685] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6aba89ad-3ece-4941-b02c-6e6dc05394e7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.425681] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2b3cfaeb-2071-489e-ab8a-26757db5a026 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.428033] env[67964]: DEBUG oslo_vmware.api [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Task: {'id': task-3456764, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071914} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 926.428033] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 926.428033] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 926.428220] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 926.428220] env[67964]: INFO nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 926.430744] env[67964]: DEBUG nova.compute.claims [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 926.430977] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 926.431160] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 926.448763] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 926.503898] env[67964]: DEBUG oslo_vmware.rw_handles [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ed0adb0d-dab6-4ffd-b4ec-97cb1ef94adc/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 926.566031] env[67964]: DEBUG oslo_vmware.rw_handles [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 926.566031] env[67964]: DEBUG oslo_vmware.rw_handles [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ed0adb0d-dab6-4ffd-b4ec-97cb1ef94adc/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 926.934224] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d25c940-4b1c-4391-b48b-4cce4db8bbc9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.942552] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88884919-e1e4-4dd6-851d-ff50a44217d8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.973821] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf1909d0-292f-4b64-b117-f85955b93992 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.981264] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ce8560d-958d-4ca6-a725-dc059608bd1b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.994839] env[67964]: DEBUG nova.compute.provider_tree [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 927.004652] env[67964]: DEBUG nova.scheduler.client.report [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 927.021664] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.590s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 927.022443] env[67964]: ERROR nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 927.022443] env[67964]: Faults: ['InvalidArgument'] [ 927.022443] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Traceback (most recent call last): [ 927.022443] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in 
_build_and_run_instance [ 927.022443] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] self.driver.spawn(context, instance, image_meta, [ 927.022443] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 927.022443] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] self._vmops.spawn(context, instance, image_meta, injected_files, [ 927.022443] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 927.022443] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] self._fetch_image_if_missing(context, vi) [ 927.022443] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 927.022443] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] image_cache(vi, tmp_image_ds_loc) [ 927.022443] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 927.022916] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] vm_util.copy_virtual_disk( [ 927.022916] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 927.022916] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] session._wait_for_task(vmdk_copy_task) [ 927.022916] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 927.022916] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] return self.wait_for_task(task_ref) [ 927.022916] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 927.022916] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] return evt.wait() [ 927.022916] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 927.022916] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] result = hub.switch() [ 927.022916] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 927.022916] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] return self.greenlet.switch() [ 927.022916] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 927.022916] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] self.f(*self.args, **self.kw) [ 927.023337] env[67964]: ERROR nova.compute.manager [instance: 
371aeb17-ad59-4a01-88f7-466dfee8d293] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 927.023337] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] raise exceptions.translate_fault(task_info.error) [ 927.023337] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 927.023337] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Faults: ['InvalidArgument'] [ 927.023337] env[67964]: ERROR nova.compute.manager [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] [ 927.023616] env[67964]: DEBUG nova.compute.utils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 927.025625] env[67964]: DEBUG nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Build of instance 371aeb17-ad59-4a01-88f7-466dfee8d293 was re-scheduled: A specified parameter was not correct: fileType [ 927.025625] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 927.026020] env[67964]: DEBUG nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 927.026198] env[67964]: DEBUG nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 927.026349] env[67964]: DEBUG nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 927.026504] env[67964]: DEBUG nova.network.neutron [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 927.547735] env[67964]: DEBUG nova.network.neutron [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 927.561489] env[67964]: INFO nova.compute.manager [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Took 0.53 seconds to deallocate network for instance. [ 927.664585] env[67964]: INFO nova.scheduler.client.report [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Deleted allocations for instance 371aeb17-ad59-4a01-88f7-466dfee8d293 [ 927.691932] env[67964]: DEBUG oslo_concurrency.lockutils [None req-dfb385f8-50dd-4889-816a-90d4cf495509 tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Lock "371aeb17-ad59-4a01-88f7-466dfee8d293" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 344.845s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 927.692545] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0dfbf6ad-8771-428d-8c2b-d447430a397e tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Lock "371aeb17-ad59-4a01-88f7-466dfee8d293" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 146.527s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 927.692794] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0dfbf6ad-8771-428d-8c2b-d447430a397e tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Acquiring lock "371aeb17-ad59-4a01-88f7-466dfee8d293-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 927.692974] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0dfbf6ad-8771-428d-8c2b-d447430a397e tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Lock 
"371aeb17-ad59-4a01-88f7-466dfee8d293-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 927.693158] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0dfbf6ad-8771-428d-8c2b-d447430a397e tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Lock "371aeb17-ad59-4a01-88f7-466dfee8d293-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 927.696876] env[67964]: INFO nova.compute.manager [None req-0dfbf6ad-8771-428d-8c2b-d447430a397e tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Terminating instance [ 927.697685] env[67964]: DEBUG nova.compute.manager [None req-0dfbf6ad-8771-428d-8c2b-d447430a397e tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 927.697873] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-0dfbf6ad-8771-428d-8c2b-d447430a397e tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 927.698372] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7e55a998-75f1-4a62-84da-963cf586cbe0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 927.709187] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35b394f9-1d7f-4cb5-8595-82e93ce49cac {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 927.721570] env[67964]: DEBUG nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 927.743154] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-0dfbf6ad-8771-428d-8c2b-d447430a397e tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 371aeb17-ad59-4a01-88f7-466dfee8d293 could not be found. 
[ 927.745741] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-0dfbf6ad-8771-428d-8c2b-d447430a397e tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 927.745741] env[67964]: INFO nova.compute.manager [None req-0dfbf6ad-8771-428d-8c2b-d447430a397e tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Took 0.05 seconds to destroy the instance on the hypervisor. [ 927.745741] env[67964]: DEBUG oslo.service.loopingcall [None req-0dfbf6ad-8771-428d-8c2b-d447430a397e tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 927.745741] env[67964]: DEBUG nova.compute.manager [-] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 927.745741] env[67964]: DEBUG nova.network.neutron [-] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 927.777985] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 927.778327] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 927.780050] env[67964]: INFO nova.compute.claims [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 927.783483] env[67964]: DEBUG nova.network.neutron [-] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 927.794037] env[67964]: INFO nova.compute.manager [-] [instance: 371aeb17-ad59-4a01-88f7-466dfee8d293] Took 0.05 seconds to deallocate network for instance. 
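
The lock entries in this sequence show why the terminate ran only now: build_and_run_instance and terminate_instance serialize on the same per-instance lock name, so do_terminate_instance waited 146.527s while the (rescheduling) build held the lock for 344.845s, then completed its own work in 0.238s, as the release entry just below records. A minimal sketch of that serialization using oslo.concurrency's public lock() helper; Nova's own wrapper additionally logs the waited/held durations seen in these entries:

    from oslo_concurrency import lockutils

    def locked(fn, instance_uuid):
        # One named lock per instance UUID: the name is identical in both the
        # build and terminate paths, so the two are mutually exclusive.
        with lockutils.lock(instance_uuid):
            fn(instance_uuid)

    def build(instance_uuid):      # stand-in for _locked_do_build_and_run_instance
        print('building %s' % instance_uuid)

    def terminate(instance_uuid):  # stand-in for do_terminate_instance
        print('terminating %s' % instance_uuid)

    locked(build, '371aeb17-ad59-4a01-88f7-466dfee8d293')
    locked(terminate, '371aeb17-ad59-4a01-88f7-466dfee8d293')
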
[ 927.930311] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0dfbf6ad-8771-428d-8c2b-d447430a397e tempest-ServerDiagnosticsNegativeTest-537727777 tempest-ServerDiagnosticsNegativeTest-537727777-project-member] Lock "371aeb17-ad59-4a01-88f7-466dfee8d293" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.238s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 928.257995] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c94ced1-a43b-4a6e-a307-3876471e2be4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 928.266909] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10744730-9f8f-4652-8881-4ca4d0243b01 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 928.298221] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f73ce079-1972-4e25-a78a-36c7473c281c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 928.305897] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45fbe256-c991-4025-a5db-ac77a4134685 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 928.319630] env[67964]: DEBUG nova.compute.provider_tree [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 928.330557] env[67964]: DEBUG nova.scheduler.client.report [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 928.345909] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.568s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 928.346424] env[67964]: DEBUG nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Start building networks asynchronously for instance. 
{{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 928.380042] env[67964]: DEBUG nova.compute.utils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 928.381760] env[67964]: DEBUG nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 928.381943] env[67964]: DEBUG nova.network.neutron [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 928.390877] env[67964]: DEBUG nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 928.471748] env[67964]: DEBUG nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 928.475138] env[67964]: DEBUG nova.policy [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a763e8aff6184f72b8f07826702fe981', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2ece3f919eb54985b4ab3bf0a9362717', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 928.510531] env[67964]: DEBUG nova.virt.hardware [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 928.510778] env[67964]: DEBUG nova.virt.hardware [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 928.510933] env[67964]: DEBUG nova.virt.hardware [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 928.511133] env[67964]: DEBUG nova.virt.hardware [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 928.511307] env[67964]: DEBUG nova.virt.hardware [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 928.511455] env[67964]: DEBUG nova.virt.hardware [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 
928.511734] env[67964]: DEBUG nova.virt.hardware [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 928.511842] env[67964]: DEBUG nova.virt.hardware [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 928.512014] env[67964]: DEBUG nova.virt.hardware [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 928.512215] env[67964]: DEBUG nova.virt.hardware [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 928.512398] env[67964]: DEBUG nova.virt.hardware [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 928.513559] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8b72900-341a-4fcf-b95d-7b57c7effcab {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 928.521783] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f80538ba-4b0c-4720-a141-2c4ea470834b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 928.934950] env[67964]: DEBUG nova.network.neutron [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Successfully created port: 8905f8b0-5662-4af4-a127-95f4596f4cdf {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 929.961087] env[67964]: DEBUG nova.network.neutron [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Successfully updated port: 8905f8b0-5662-4af4-a127-95f4596f4cdf {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 929.977640] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquiring lock "refresh_cache-c648c89a-ca70-4a15-9083-0cbe9e5bee23" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 929.977793] env[67964]: DEBUG 
oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquired lock "refresh_cache-c648c89a-ca70-4a15-9083-0cbe9e5bee23" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 929.977945] env[67964]: DEBUG nova.network.neutron [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 930.026167] env[67964]: DEBUG nova.network.neutron [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 930.165067] env[67964]: DEBUG nova.compute.manager [req-49878b6d-2203-4251-8f38-1d5f8cc0d544 req-d1d4636a-d0dd-4aeb-9051-1745eb7e54d2 service nova] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Received event network-vif-plugged-8905f8b0-5662-4af4-a127-95f4596f4cdf {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 930.165426] env[67964]: DEBUG oslo_concurrency.lockutils [req-49878b6d-2203-4251-8f38-1d5f8cc0d544 req-d1d4636a-d0dd-4aeb-9051-1745eb7e54d2 service nova] Acquiring lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 930.165752] env[67964]: DEBUG oslo_concurrency.lockutils [req-49878b6d-2203-4251-8f38-1d5f8cc0d544 req-d1d4636a-d0dd-4aeb-9051-1745eb7e54d2 service nova] Lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 930.166056] env[67964]: DEBUG oslo_concurrency.lockutils [req-49878b6d-2203-4251-8f38-1d5f8cc0d544 req-d1d4636a-d0dd-4aeb-9051-1745eb7e54d2 service nova] Lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 930.166347] env[67964]: DEBUG nova.compute.manager [req-49878b6d-2203-4251-8f38-1d5f8cc0d544 req-d1d4636a-d0dd-4aeb-9051-1745eb7e54d2 service nova] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] No waiting events found dispatching network-vif-plugged-8905f8b0-5662-4af4-a127-95f4596f4cdf {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 930.166612] env[67964]: WARNING nova.compute.manager [req-49878b6d-2203-4251-8f38-1d5f8cc0d544 req-d1d4636a-d0dd-4aeb-9051-1745eb7e54d2 service nova] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Received unexpected event network-vif-plugged-8905f8b0-5662-4af4-a127-95f4596f4cdf for instance with vm_state building and task_state spawning. 
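
The nova.virt.hardware entries above walk a small search: the m1.nano flavor requests 1 vCPU, neither the flavor nor the image expresses topology limits or preferences (all 0:0:0, so the maxima default to 65536), and the only factorization of one vCPU is sockets=1, cores=1, threads=1. A simplified re-implementation of that enumeration for illustration (not nova.virt.hardware itself):

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        # Enumerate every (sockets, cores, threads) factorization of the vCPU
        # count that fits inside the limits, mirroring the search logged above.
        topos = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            for cores in range(1, min(vcpus, max_cores) + 1):
                for threads in range(1, min(vcpus, max_threads) + 1):
                    if sockets * cores * threads == vcpus:
                        topos.append((sockets, cores, threads))
        return topos

    print(possible_topologies(1))  # [(1, 1, 1)] -> "Got 1 possible topologies"

With more vCPUs the same search yields several candidates, which are then sorted against the preferred topology; here there is exactly one, so it appears unchanged as both the possible and the sorted-desired result.
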
[ 930.243600] env[67964]: DEBUG nova.network.neutron [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Updating instance_info_cache with network_info: [{"id": "8905f8b0-5662-4af4-a127-95f4596f4cdf", "address": "fa:16:3e:01:19:36", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8905f8b0-56", "ovs_interfaceid": "8905f8b0-5662-4af4-a127-95f4596f4cdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 930.256087] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Releasing lock "refresh_cache-c648c89a-ca70-4a15-9083-0cbe9e5bee23" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 930.256460] env[67964]: DEBUG nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Instance network_info: |[{"id": "8905f8b0-5662-4af4-a127-95f4596f4cdf", "address": "fa:16:3e:01:19:36", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8905f8b0-56", "ovs_interfaceid": "8905f8b0-5662-4af4-a127-95f4596f4cdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 930.256851] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e 
tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:01:19:36', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fa01fe1a-83b6-4c10-af75-00ddb17f9bbf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8905f8b0-5662-4af4-a127-95f4596f4cdf', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 930.267246] env[67964]: DEBUG oslo.service.loopingcall [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 930.268654] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 930.270069] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-900680c1-5996-490b-b045-e4e91fb49c78 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 930.295207] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 930.295207] env[67964]: value = "task-3456765" [ 930.295207] env[67964]: _type = "Task" [ 930.295207] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 930.305942] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456765, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 930.807682] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456765, 'name': CreateVM_Task, 'duration_secs': 0.288048} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 930.807996] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 930.808702] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 930.808892] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 930.809237] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 930.809505] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5f51aa37-3ad2-45a5-a511-692168f16844 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 930.814520] env[67964]: DEBUG oslo_vmware.api [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Waiting for the task: (returnval){ [ 930.814520] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5210a22e-33e7-0a6d-6d20-5bd549a39252" [ 930.814520] env[67964]: _type = "Task" [ 930.814520] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 930.828283] env[67964]: DEBUG oslo_vmware.api [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5210a22e-33e7-0a6d-6d20-5bd549a39252, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 931.332363] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 931.332631] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 931.332850] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 932.310859] env[67964]: DEBUG nova.compute.manager [req-f36f3f9a-41f9-49b9-a607-e667af184d79 req-9cd116ad-ddc9-4eec-b792-176c0ab9bc5e service nova] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Received event network-changed-8905f8b0-5662-4af4-a127-95f4596f4cdf {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 932.311081] env[67964]: DEBUG nova.compute.manager [req-f36f3f9a-41f9-49b9-a607-e667af184d79 req-9cd116ad-ddc9-4eec-b792-176c0ab9bc5e service nova] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Refreshing instance network info cache due to event network-changed-8905f8b0-5662-4af4-a127-95f4596f4cdf. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 932.311433] env[67964]: DEBUG oslo_concurrency.lockutils [req-f36f3f9a-41f9-49b9-a607-e667af184d79 req-9cd116ad-ddc9-4eec-b792-176c0ab9bc5e service nova] Acquiring lock "refresh_cache-c648c89a-ca70-4a15-9083-0cbe9e5bee23" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 932.311576] env[67964]: DEBUG oslo_concurrency.lockutils [req-f36f3f9a-41f9-49b9-a607-e667af184d79 req-9cd116ad-ddc9-4eec-b792-176c0ab9bc5e service nova] Acquired lock "refresh_cache-c648c89a-ca70-4a15-9083-0cbe9e5bee23" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 932.311609] env[67964]: DEBUG nova.network.neutron [req-f36f3f9a-41f9-49b9-a607-e667af184d79 req-9cd116ad-ddc9-4eec-b792-176c0ab9bc5e service nova] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Refreshing network info cache for port 8905f8b0-5662-4af4-a127-95f4596f4cdf {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 932.667444] env[67964]: DEBUG nova.network.neutron [req-f36f3f9a-41f9-49b9-a607-e667af184d79 req-9cd116ad-ddc9-4eec-b792-176c0ab9bc5e service nova] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Updated VIF entry in instance network info cache for port 8905f8b0-5662-4af4-a127-95f4596f4cdf. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 932.667444] env[67964]: DEBUG nova.network.neutron [req-f36f3f9a-41f9-49b9-a607-e667af184d79 req-9cd116ad-ddc9-4eec-b792-176c0ab9bc5e service nova] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Updating instance_info_cache with network_info: [{"id": "8905f8b0-5662-4af4-a127-95f4596f4cdf", "address": "fa:16:3e:01:19:36", "network": {"id": "f0006a9c-61de-4bf3-93f5-2254d58c780e", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "34fc8bdd38bd4d2781a21b19049364a0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8905f8b0-56", "ovs_interfaceid": "8905f8b0-5662-4af4-a127-95f4596f4cdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 932.677842] env[67964]: DEBUG oslo_concurrency.lockutils [req-f36f3f9a-41f9-49b9-a607-e667af184d79 req-9cd116ad-ddc9-4eec-b792-176c0ab9bc5e service nova] Releasing lock "refresh_cache-c648c89a-ca70-4a15-9083-0cbe9e5bee23" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 936.578572] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Acquiring lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 936.578892] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 937.247719] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01951162-df92-4a21-8fe9-8c5624bd2208 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Acquiring lock "4877dc66-8ac8-4f7e-9a49-97a7adb95e72" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 937.247947] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01951162-df92-4a21-8fe9-8c5624bd2208 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Lock "4877dc66-8ac8-4f7e-9a49-97a7adb95e72" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 938.075954] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e748d981-02ef-4eda-8dae-344b3aced10e tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Acquiring lock "ea492fb8-2352-436c-a7d5-f20423f4d353" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 956.800354] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 956.800605] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 956.800765] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 956.812181] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 956.812445] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 956.812617] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 956.812774] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 956.813975] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cf2570f-6209-4521-814b-637aa2351755 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.823040] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7439bcc5-0844-4a65-96e3-3e396a204297 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.836837] env[67964]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b35bf1c-ce30-40bd-9c5c-150c14b8ad87 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.843252] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d557aae-e763-4ad7-a9fe-3a96183e7f4d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.873417] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180883MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 956.873601] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 956.873766] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 956.952844] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 180338df-2738-4eeb-8610-cb130d04f6d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 956.953015] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8b261c6e-741c-4d6c-9567-566af85cd68f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 956.953154] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9c586d33-c563-45c7-8c54-1638a78a669c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 956.953278] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 6580c348-f5a4-4f20-a6fb-8942202a526e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 956.953396] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fed6991c-9b59-43bb-8cda-96053adb798b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 956.953510] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 707828f6-0267-42ff-95e5-6b328382b017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 956.953626] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0768fe80-7dd3-42ec-8e22-42a6aece5bef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 956.953739] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9e47d3ce-3897-458b-ac85-d98745e9aeb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 956.953850] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea492fb8-2352-436c-a7d5-f20423f4d353 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 956.953960] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c648c89a-ca70-4a15-9083-0cbe9e5bee23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 956.981816] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9793d383-9033-4f86-b7bb-6b2e43347cd6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 956.994682] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 6c329e27-945e-4996-9994-85d207c35325 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.006025] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 5fbee4c3-bc7c-4582-b976-b0d619a69cdb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.016205] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 09c05646-301a-4d74-957c-1c9c6b7ab44b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.026821] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f6aedef6-3d4d-4839-863b-771ac818a1c4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.037122] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 34481f0e-b35a-4405-be54-ac23326f1183 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.046511] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ae68e8fe-d3d6-4313-85d7-7e2fefa3a1ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.056383] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec330488-db38-486f-8d54-17afd9f07ce3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.068814] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ae1668bc-04cb-4767-847a-d2b7c3d95156 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.078153] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8d9addd9-ce3d-4d41-9736-1c7ca0b9fbbe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.089881] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7b96b0d5-a10c-4f7f-9113-46c85ea62dfe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.103110] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 94760699-7f13-42e2-abb2-45e3374eeccb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.112476] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance e239df07-066e-4dff-8302-9945a610a43a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.122573] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 809c38e0-bc92-4a77-b307-773b6df211c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.131838] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18c8cc6b-a7aa-43fc-b048-1d788f4c162b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.141603] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d90509f8-1957-4bb3-b4ec-eba8b37705b6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.151129] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance dabdde79-50a8-43fd-a998-868aec05d825 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.161883] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 67eb58c3-a895-4427-9197-3b0c731a123a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.172461] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.181452] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 4877dc66-8ac8-4f7e-9a49-97a7adb95e72 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 957.181704] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 957.181852] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 957.514669] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57589b32-4e7f-4305-8a11-89fc48ca5bb9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.522049] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86dbeb1d-4a98-48ef-822f-783ff0524d04 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.551783] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29161b5e-1750-48d7-9ae4-d0e28170484c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.558762] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43a7f95c-4a9a-4a12-8c2f-68d4106090d7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.571238] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 
2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 957.579746] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 957.595566] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 957.595774] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.722s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 958.596257] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 958.596530] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 959.800541] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 959.800541] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 960.801272] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 960.801544] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 960.801719] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 960.825702] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 960.825794] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 960.826059] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 960.826059] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 960.826198] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 960.826288] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 960.826406] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 960.826526] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 960.826641] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 960.826753] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 960.826869] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 960.827376] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 961.821832] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 974.351919] env[67964]: WARNING oslo_vmware.rw_handles [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 974.351919] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 974.351919] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 974.351919] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 974.351919] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 974.351919] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 974.351919] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 974.351919] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 974.351919] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 974.351919] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 974.351919] env[67964]: 
ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 974.351919] env[67964]: ERROR oslo_vmware.rw_handles [ 974.352553] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/ed0adb0d-dab6-4ffd-b4ec-97cb1ef94adc/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 974.354331] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 974.354491] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Copying Virtual Disk [datastore1] vmware_temp/ed0adb0d-dab6-4ffd-b4ec-97cb1ef94adc/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/ed0adb0d-dab6-4ffd-b4ec-97cb1ef94adc/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 974.354786] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-10182f69-1004-42df-892d-abdc0282a62f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 974.362407] env[67964]: DEBUG oslo_vmware.api [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Waiting for the task: (returnval){ [ 974.362407] env[67964]: value = "task-3456766" [ 974.362407] env[67964]: _type = "Task" [ 974.362407] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 974.370375] env[67964]: DEBUG oslo_vmware.api [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Task: {'id': task-3456766, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 974.874870] env[67964]: DEBUG oslo_vmware.exceptions [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Fault InvalidArgument not matched. 
{{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 974.875157] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 974.875697] env[67964]: ERROR nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 974.875697] env[67964]: Faults: ['InvalidArgument'] [ 974.875697] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Traceback (most recent call last): [ 974.875697] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 974.875697] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] yield resources [ 974.875697] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 974.875697] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] self.driver.spawn(context, instance, image_meta, [ 974.875697] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 974.875697] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 974.875697] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 974.875697] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] self._fetch_image_if_missing(context, vi) [ 974.875697] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 974.876016] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] image_cache(vi, tmp_image_ds_loc) [ 974.876016] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 974.876016] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] vm_util.copy_virtual_disk( [ 974.876016] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 974.876016] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] session._wait_for_task(vmdk_copy_task) [ 974.876016] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 974.876016] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] return self.wait_for_task(task_ref) [ 974.876016] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 974.876016] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] return evt.wait() [ 974.876016] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 974.876016] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] result = hub.switch() [ 974.876016] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 974.876016] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] return self.greenlet.switch() [ 974.876446] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 974.876446] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] self.f(*self.args, **self.kw) [ 974.876446] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 974.876446] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] raise exceptions.translate_fault(task_info.error) [ 974.876446] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 974.876446] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Faults: ['InvalidArgument'] [ 974.876446] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] [ 974.876446] env[67964]: INFO nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Terminating instance [ 974.877563] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 974.877767] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 974.878007] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-37029a92-62a1-4693-86a8-cfcd034779f2 
{{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 974.880263] env[67964]: DEBUG nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 974.880323] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 974.881033] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8d8db31-d3a6-45aa-939e-2297885d9c01 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 974.887543] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 974.887748] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9c976e30-9b53-4702-9f0e-5bb4d05d85d0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 974.889844] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 974.890022] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 974.890919] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5ddb5122-ea73-4207-a9ab-71b80dd45fbb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 974.895353] env[67964]: DEBUG oslo_vmware.api [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Waiting for the task: (returnval){ [ 974.895353] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]523acea5-d385-9ac0-188a-f8eaf41e7bb8" [ 974.895353] env[67964]: _type = "Task" [ 974.895353] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 974.910154] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 974.910154] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Creating directory with path [datastore1] vmware_temp/30923cbe-3772-4752-9591-1fdff197ecd6/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 974.910304] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0f82a2c3-0ea1-40e7-a23d-1a31dfa2ab65 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 974.931422] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Created directory with path [datastore1] vmware_temp/30923cbe-3772-4752-9591-1fdff197ecd6/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 974.931646] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Fetch image to [datastore1] vmware_temp/30923cbe-3772-4752-9591-1fdff197ecd6/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 974.931815] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/30923cbe-3772-4752-9591-1fdff197ecd6/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 974.932601] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e518066f-53ea-44be-8517-43cd761b2f54 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 974.939376] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f86e41cd-b0ce-45c3-bf5b-6f167b2a2be8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 974.948177] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50658a55-cf1e-450d-9e58-15d6e7adc9b2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 974.979570] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c14fe711-c81a-4650-823c-1713cc270f88 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 974.982084] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 974.982280] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 974.982480] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Deleting the datastore file [datastore1] 180338df-2738-4eeb-8610-cb130d04f6d2 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 974.982707] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f4312cc4-3544-49cf-adef-5aa5f2c89c9e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 974.987496] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c952b421-ee99-47dc-b72f-c419d139b626 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 974.990230] env[67964]: DEBUG oslo_vmware.api [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Waiting for the task: (returnval){ [ 974.990230] env[67964]: value = "task-3456768" [ 974.990230] env[67964]: _type = "Task" [ 974.990230] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 974.997470] env[67964]: DEBUG oslo_vmware.api [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Task: {'id': task-3456768, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 975.010882] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 975.063602] env[67964]: DEBUG oslo_vmware.rw_handles [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/30923cbe-3772-4752-9591-1fdff197ecd6/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 975.122436] env[67964]: DEBUG oslo_vmware.rw_handles [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 975.122680] env[67964]: DEBUG oslo_vmware.rw_handles [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/30923cbe-3772-4752-9591-1fdff197ecd6/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 975.500799] env[67964]: DEBUG oslo_vmware.api [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Task: {'id': task-3456768, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.096634} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 975.501435] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 975.501435] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 975.501636] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 975.501636] env[67964]: INFO nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Took 0.62 seconds to destroy the instance on the hypervisor. [ 975.503778] env[67964]: DEBUG nova.compute.claims [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 975.503964] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 975.504209] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 975.917037] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b141b25-f615-41ce-b43a-89cb1f2ade48 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 975.924806] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd37bfec-feec-4eaf-a371-c1dc8e22aa57 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 975.954483] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d360ed11-03e1-476f-9af0-5d35649fd59f {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 975.962945] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33cea848-0c0e-4c85-b831-34ecb8903641 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 975.976118] env[67964]: DEBUG nova.compute.provider_tree [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 975.984851] env[67964]: DEBUG nova.scheduler.client.report [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 976.005893] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.502s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 976.006432] env[67964]: ERROR nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 976.006432] env[67964]: Faults: ['InvalidArgument'] [ 976.006432] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Traceback (most recent call last): [ 976.006432] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 976.006432] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] self.driver.spawn(context, instance, image_meta, [ 976.006432] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 976.006432] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 976.006432] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 976.006432] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] self._fetch_image_if_missing(context, 
vi) [ 976.006432] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 976.006432] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] image_cache(vi, tmp_image_ds_loc) [ 976.006432] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 976.006770] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] vm_util.copy_virtual_disk( [ 976.006770] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 976.006770] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] session._wait_for_task(vmdk_copy_task) [ 976.006770] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 976.006770] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] return self.wait_for_task(task_ref) [ 976.006770] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 976.006770] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] return evt.wait() [ 976.006770] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 976.006770] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] result = hub.switch() [ 976.006770] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 976.006770] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] return self.greenlet.switch() [ 976.006770] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 976.006770] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] self.f(*self.args, **self.kw) [ 976.007117] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 976.007117] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] raise exceptions.translate_fault(task_info.error) [ 976.007117] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 976.007117] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Faults: ['InvalidArgument'] [ 976.007117] env[67964]: ERROR nova.compute.manager [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] [ 976.007250] env[67964]: DEBUG nova.compute.utils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 
tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 976.008540] env[67964]: DEBUG nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Build of instance 180338df-2738-4eeb-8610-cb130d04f6d2 was re-scheduled: A specified parameter was not correct: fileType [ 976.008540] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 976.008922] env[67964]: DEBUG nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 976.009111] env[67964]: DEBUG nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 976.009283] env[67964]: DEBUG nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 976.009447] env[67964]: DEBUG nova.network.neutron [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 976.319040] env[67964]: DEBUG nova.network.neutron [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 976.332915] env[67964]: INFO nova.compute.manager [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Took 0.32 seconds to deallocate network for instance. 
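The traceback above appears twice, once at spawn time and once in the final _build_and_run_instance report, so the failure path is worth spelling out: vm_util.copy_virtual_disk submits a CopyVirtualDisk_Task, session._wait_for_task blocks on a looping call that polls the task, and _poll_task translates the vCenter error (InvalidArgument on fileType, which the "Fault InvalidArgument not matched" DEBUG line shows has no dedicated exception class) into a generic VimFaultException that propagates up through driver.spawn. A minimal sketch of that polling pattern follows; the function and class names mirror the traceback, but the bodies are illustrative stand-ins, not the oslo.vmware source:

```python
# Illustrative sketch of the task-polling pattern visible in the traceback.
# get_task_info / the dict shapes are simplified stand-ins, not oslo.vmware's API.
import time


class VimFaultException(Exception):
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list


def translate_fault(error):
    # oslo.vmware maps known fault names to specific exception classes; when
    # none matches (the "Fault InvalidArgument not matched" DEBUG line above),
    # it falls back to a generic VimFaultException carrying the fault list.
    return VimFaultException(error.get("faults", []), error.get("message", ""))


def wait_for_task(get_task_info, interval=0.5):
    """Poll a vCenter task until it finishes, mirroring the _poll_task loop."""
    while True:
        info = get_task_info()  # e.g. Task: {'id': 'task-3456766', 'state': ...}
        if info["state"] == "success":
            return info.get("result")
        if info["state"] == "error":
            # This is the raise seen in the traceback:
            #   raise exceptions.translate_fault(task_info.error)
            raise translate_fault(info["error"])
        time.sleep(interval)  # the real code yields a green thread instead
```

Because the exception surfaces inside _fetch_image_if_missing, the compute manager treats it as a spawn failure: the claim is aborted, the build is re-scheduled, and the network is deallocated, exactly the sequence logged between 975.5 and 976.3 above.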
[ 976.426448] env[67964]: INFO nova.scheduler.client.report [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Deleted allocations for instance 180338df-2738-4eeb-8610-cb130d04f6d2 [ 976.446466] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f730ad8f-0192-459f-b488-f1ad0891d4a8 tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Lock "180338df-2738-4eeb-8610-cb130d04f6d2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 388.080s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 976.447654] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71b5b1d6-ae00-4b1d-af29-e786806cdb3b tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Lock "180338df-2738-4eeb-8610-cb130d04f6d2" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 187.482s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 976.447875] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71b5b1d6-ae00-4b1d-af29-e786806cdb3b tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Acquiring lock "180338df-2738-4eeb-8610-cb130d04f6d2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 976.448090] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71b5b1d6-ae00-4b1d-af29-e786806cdb3b tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Lock "180338df-2738-4eeb-8610-cb130d04f6d2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 976.448253] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71b5b1d6-ae00-4b1d-af29-e786806cdb3b tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Lock "180338df-2738-4eeb-8610-cb130d04f6d2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 976.451914] env[67964]: INFO nova.compute.manager [None req-71b5b1d6-ae00-4b1d-af29-e786806cdb3b tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Terminating instance [ 976.453730] env[67964]: DEBUG nova.compute.manager [None req-71b5b1d6-ae00-4b1d-af29-e786806cdb3b tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 976.453963] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-71b5b1d6-ae00-4b1d-af29-e786806cdb3b tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 976.454617] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cd9ccd4e-fa2e-4411-bfbd-0d7c0f2b6eeb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.460168] env[67964]: DEBUG nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 976.468040] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e796280-f994-425c-88d1-839d440be32c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.496664] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-71b5b1d6-ae00-4b1d-af29-e786806cdb3b tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 180338df-2738-4eeb-8610-cb130d04f6d2 could not be found. [ 976.496879] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-71b5b1d6-ae00-4b1d-af29-e786806cdb3b tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 976.497070] env[67964]: INFO nova.compute.manager [None req-71b5b1d6-ae00-4b1d-af29-e786806cdb3b tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 976.497329] env[67964]: DEBUG oslo.service.loopingcall [None req-71b5b1d6-ae00-4b1d-af29-e786806cdb3b tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 976.497571] env[67964]: DEBUG nova.compute.manager [-] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 976.497670] env[67964]: DEBUG nova.network.neutron [-] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 976.519017] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 976.519305] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 976.520764] env[67964]: INFO nova.compute.claims [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 976.532845] env[67964]: DEBUG nova.network.neutron [-] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 976.549712] env[67964]: INFO nova.compute.manager [-] [instance: 180338df-2738-4eeb-8610-cb130d04f6d2] Took 0.05 seconds to deallocate network for instance. 
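The second terminate at 976.4 illustrates why the delete path tolerates a missing backend VM: the earlier re-schedule already destroyed it, so destroy() catches InstanceNotFound, emits the WARNING seen above ("Instance does not exist on backend"), and still proceeds to network deallocation so the API-side delete completes. A rough sketch of that idempotent-destroy shape, with hypothetical helper names (find_vm_ref, unregister_and_delete, deallocate_network) standing in for the vmops internals:

```python
# Sketch of an idempotent destroy under assumed helper names; the real
# nova.virt.vmwareapi.vmops code differs in structure and detail.
class InstanceNotFound(Exception):
    pass


def destroy(instance_uuid, find_vm_ref, unregister_and_delete, deallocate_network):
    """Destroy a VM, tolerating the case where the backend VM is already gone."""
    try:
        vm_ref = find_vm_ref(instance_uuid)  # cf. SearchIndex.FindAllByUuid above
        unregister_and_delete(vm_ref)        # cf. UnregisterVM + DeleteDatastoreFile_Task
    except InstanceNotFound:
        # Matches the WARNING above: the first terminate (via the re-schedule)
        # already removed the VM, so this pass has nothing to delete.
        pass
    deallocate_network(instance_uuid)        # always runs, keeping delete idempotent
```

That tolerance is why the terminate lock above is held for only 0.195s while the original build lock was held for 388s: the second pass finds nothing on the hypervisor and reduces to bookkeeping and network cleanup.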
[ 976.642993] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71b5b1d6-ae00-4b1d-af29-e786806cdb3b tempest-ImagesOneServerNegativeTestJSON-1686241114 tempest-ImagesOneServerNegativeTestJSON-1686241114-project-member] Lock "180338df-2738-4eeb-8610-cb130d04f6d2" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.195s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 976.935513] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d35bdd61-0a91-4849-88a0-78a0b65b76b1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.943426] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5afde7b3-3b21-422a-a382-ae2995979311 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.973424] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c1d9f38-a333-4f83-a846-b545954f5791 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.980666] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13aca6c8-9002-43c1-8731-58b923e23aea {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.994915] env[67964]: DEBUG nova.compute.provider_tree [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 977.004701] env[67964]: DEBUG nova.scheduler.client.report [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 977.021479] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.502s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 977.021987] env[67964]: DEBUG nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Start building networks asynchronously for instance. 
{{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 977.058833] env[67964]: DEBUG nova.compute.utils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 977.060120] env[67964]: DEBUG nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 977.060290] env[67964]: DEBUG nova.network.neutron [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 977.068873] env[67964]: DEBUG nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 977.132759] env[67964]: DEBUG nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 977.136143] env[67964]: DEBUG nova.policy [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4582636e1ee74b61878e4c1badbd563e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '15502e37757142d4afa0577a3e80bfb8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 977.163142] env[67964]: DEBUG nova.virt.hardware [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 977.163319] env[67964]: DEBUG nova.virt.hardware [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 977.163484] env[67964]: DEBUG nova.virt.hardware [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 977.163661] env[67964]: DEBUG nova.virt.hardware [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 977.163804] env[67964]: DEBUG nova.virt.hardware [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 977.163946] env[67964]: DEBUG nova.virt.hardware [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 977.164168] env[67964]: DEBUG 
nova.virt.hardware [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 977.164469] env[67964]: DEBUG nova.virt.hardware [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 977.164765] env[67964]: DEBUG nova.virt.hardware [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 977.165201] env[67964]: DEBUG nova.virt.hardware [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 977.165397] env[67964]: DEBUG nova.virt.hardware [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 977.166456] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a43cf3c0-7371-4d2e-b2cc-f23f61a8c18a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.174617] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-533f32c1-1612-4555-b7fb-826f2f2c8beb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.480567] env[67964]: DEBUG nova.network.neutron [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Successfully created port: 1dbc0906-ad66-4179-9ca5-30d96757badd {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 978.272081] env[67964]: DEBUG nova.network.neutron [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Successfully updated port: 1dbc0906-ad66-4179-9ca5-30d96757badd {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 978.285064] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "refresh_cache-9793d383-9033-4f86-b7bb-6b2e43347cd6" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 978.285242] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd 
tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquired lock "refresh_cache-9793d383-9033-4f86-b7bb-6b2e43347cd6" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 978.285400] env[67964]: DEBUG nova.network.neutron [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 978.343619] env[67964]: DEBUG nova.network.neutron [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 978.530567] env[67964]: DEBUG nova.compute.manager [req-0f472e7f-6b09-464f-b851-ae48aea3ed7c req-40211cb1-224e-4617-b210-0f61875c5fcc service nova] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Received event network-vif-plugged-1dbc0906-ad66-4179-9ca5-30d96757badd {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 978.531571] env[67964]: DEBUG oslo_concurrency.lockutils [req-0f472e7f-6b09-464f-b851-ae48aea3ed7c req-40211cb1-224e-4617-b210-0f61875c5fcc service nova] Acquiring lock "9793d383-9033-4f86-b7bb-6b2e43347cd6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 978.531571] env[67964]: DEBUG oslo_concurrency.lockutils [req-0f472e7f-6b09-464f-b851-ae48aea3ed7c req-40211cb1-224e-4617-b210-0f61875c5fcc service nova] Lock "9793d383-9033-4f86-b7bb-6b2e43347cd6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 978.531571] env[67964]: DEBUG oslo_concurrency.lockutils [req-0f472e7f-6b09-464f-b851-ae48aea3ed7c req-40211cb1-224e-4617-b210-0f61875c5fcc service nova] Lock "9793d383-9033-4f86-b7bb-6b2e43347cd6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 978.531571] env[67964]: DEBUG nova.compute.manager [req-0f472e7f-6b09-464f-b851-ae48aea3ed7c req-40211cb1-224e-4617-b210-0f61875c5fcc service nova] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] No waiting events found dispatching network-vif-plugged-1dbc0906-ad66-4179-9ca5-30d96757badd {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 978.531900] env[67964]: WARNING nova.compute.manager [req-0f472e7f-6b09-464f-b851-ae48aea3ed7c req-40211cb1-224e-4617-b210-0f61875c5fcc service nova] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Received unexpected event network-vif-plugged-1dbc0906-ad66-4179-9ca5-30d96757badd for instance with vm_state building and task_state spawning. 
[ 978.531900] env[67964]: DEBUG nova.compute.manager [req-0f472e7f-6b09-464f-b851-ae48aea3ed7c req-40211cb1-224e-4617-b210-0f61875c5fcc service nova] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Received event network-changed-1dbc0906-ad66-4179-9ca5-30d96757badd {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 978.531900] env[67964]: DEBUG nova.compute.manager [req-0f472e7f-6b09-464f-b851-ae48aea3ed7c req-40211cb1-224e-4617-b210-0f61875c5fcc service nova] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Refreshing instance network info cache due to event network-changed-1dbc0906-ad66-4179-9ca5-30d96757badd. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 978.532057] env[67964]: DEBUG oslo_concurrency.lockutils [req-0f472e7f-6b09-464f-b851-ae48aea3ed7c req-40211cb1-224e-4617-b210-0f61875c5fcc service nova] Acquiring lock "refresh_cache-9793d383-9033-4f86-b7bb-6b2e43347cd6" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 978.915347] env[67964]: DEBUG nova.network.neutron [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Updating instance_info_cache with network_info: [{"id": "1dbc0906-ad66-4179-9ca5-30d96757badd", "address": "fa:16:3e:0f:90:c4", "network": {"id": "35550b63-2fb8-405c-84f4-2ef94086947d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1240380541-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "15502e37757142d4afa0577a3e80bfb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4d548e7-d762-406a-bb2d-dc7168a8ca67", "external-id": "nsx-vlan-transportzone-796", "segmentation_id": 796, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dbc0906-ad", "ovs_interfaceid": "1dbc0906-ad66-4179-9ca5-30d96757badd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 978.928189] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Releasing lock "refresh_cache-9793d383-9033-4f86-b7bb-6b2e43347cd6" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 978.928436] env[67964]: DEBUG nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Instance network_info: |[{"id": "1dbc0906-ad66-4179-9ca5-30d96757badd", "address": "fa:16:3e:0f:90:c4", "network": {"id": "35550b63-2fb8-405c-84f4-2ef94086947d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1240380541-network", "subnets": [{"cidr": 
"192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "15502e37757142d4afa0577a3e80bfb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4d548e7-d762-406a-bb2d-dc7168a8ca67", "external-id": "nsx-vlan-transportzone-796", "segmentation_id": 796, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dbc0906-ad", "ovs_interfaceid": "1dbc0906-ad66-4179-9ca5-30d96757badd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 978.928716] env[67964]: DEBUG oslo_concurrency.lockutils [req-0f472e7f-6b09-464f-b851-ae48aea3ed7c req-40211cb1-224e-4617-b210-0f61875c5fcc service nova] Acquired lock "refresh_cache-9793d383-9033-4f86-b7bb-6b2e43347cd6" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 978.928883] env[67964]: DEBUG nova.network.neutron [req-0f472e7f-6b09-464f-b851-ae48aea3ed7c req-40211cb1-224e-4617-b210-0f61875c5fcc service nova] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Refreshing network info cache for port 1dbc0906-ad66-4179-9ca5-30d96757badd {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 978.930020] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0f:90:c4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b4d548e7-d762-406a-bb2d-dc7168a8ca67', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1dbc0906-ad66-4179-9ca5-30d96757badd', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 978.937883] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Creating folder: Project (15502e37757142d4afa0577a3e80bfb8). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 978.940700] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2de82e4c-f95d-428f-8432-a092a1ca7570 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 978.952565] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Created folder: Project (15502e37757142d4afa0577a3e80bfb8) in parent group-v690366. [ 978.956034] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Creating folder: Instances. 
Parent ref: group-v690424. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 978.956034] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fabe77d2-2a90-45f5-aecb-52903c14b13d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 979.002422] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Created folder: Instances in parent group-v690424. [ 979.002422] env[67964]: DEBUG oslo.service.loopingcall [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 979.002422] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 979.002422] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e042b346-68a8-47c1-a786-189a22b9ecfb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 979.002422] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 979.002422] env[67964]: value = "task-3456771" [ 979.002422] env[67964]: _type = "Task" [ 979.002422] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 979.002422] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456771, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 979.492046] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456771, 'name': CreateVM_Task, 'duration_secs': 0.290574} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 979.492364] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 979.492927] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 979.493157] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 979.493541] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 979.493828] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c45f6464-615f-4679-8556-fb5a54d2788f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 979.496501] env[67964]: DEBUG nova.network.neutron [req-0f472e7f-6b09-464f-b851-ae48aea3ed7c req-40211cb1-224e-4617-b210-0f61875c5fcc service nova] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Updated VIF entry in instance network info cache for port 1dbc0906-ad66-4179-9ca5-30d96757badd. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 979.496854] env[67964]: DEBUG nova.network.neutron [req-0f472e7f-6b09-464f-b851-ae48aea3ed7c req-40211cb1-224e-4617-b210-0f61875c5fcc service nova] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Updating instance_info_cache with network_info: [{"id": "1dbc0906-ad66-4179-9ca5-30d96757badd", "address": "fa:16:3e:0f:90:c4", "network": {"id": "35550b63-2fb8-405c-84f4-2ef94086947d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1240380541-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "15502e37757142d4afa0577a3e80bfb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4d548e7-d762-406a-bb2d-dc7168a8ca67", "external-id": "nsx-vlan-transportzone-796", "segmentation_id": 796, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dbc0906-ad", "ovs_interfaceid": "1dbc0906-ad66-4179-9ca5-30d96757badd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 979.499166] env[67964]: DEBUG oslo_vmware.api [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for the task: (returnval){ [ 979.499166] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52c75146-93f4-72b0-8f9b-e2561021659d" [ 979.499166] env[67964]: _type = "Task" [ 979.499166] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 979.516198] env[67964]: DEBUG oslo_vmware.api [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52c75146-93f4-72b0-8f9b-e2561021659d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 979.516746] env[67964]: DEBUG oslo_concurrency.lockutils [req-0f472e7f-6b09-464f-b851-ae48aea3ed7c req-40211cb1-224e-4617-b210-0f61875c5fcc service nova] Releasing lock "refresh_cache-9793d383-9033-4f86-b7bb-6b2e43347cd6" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 980.009935] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 980.010235] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 980.010615] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 980.029469] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquiring lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 985.810295] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Acquiring lock "d9dcb5d4-e8a3-4d4d-af94-1bde87121c08" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 985.810621] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Lock "d9dcb5d4-e8a3-4d4d-af94-1bde87121c08" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 996.603689] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a9a03e68-2cc5-4370-b58f-05f3d7670428 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "57445f5b-8a3a-4d55-b926-ee2d3e24b6ce" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
996.603977] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a9a03e68-2cc5-4370-b58f-05f3d7670428 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "57445f5b-8a3a-4d55-b926-ee2d3e24b6ce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 997.891203] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a0e5a5de-4a9a-4103-b8c2-447de0f46a3a tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Acquiring lock "53718899-b65f-4e3b-a8d6-7277e946ab43" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 997.891552] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a0e5a5de-4a9a-4103-b8c2-447de0f46a3a tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Lock "53718899-b65f-4e3b-a8d6-7277e946ab43" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 999.472085] env[67964]: DEBUG oslo_concurrency.lockutils [None req-349d2e60-f63a-4842-8fb9-4995dfb65b9c tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "236faf76-d72e-4c2b-9b44-9d1866491310" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 999.472379] env[67964]: DEBUG oslo_concurrency.lockutils [None req-349d2e60-f63a-4842-8fb9-4995dfb65b9c tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "236faf76-d72e-4c2b-9b44-9d1866491310" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1007.628466] env[67964]: DEBUG oslo_concurrency.lockutils [None req-13c83e84-246b-4204-9e46-2871744e0170 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "9793d383-9033-4f86-b7bb-6b2e43347cd6" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1008.379999] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fe2ae47e-5133-4c60-b3e2-6249def18f17 tempest-ServersTestMultiNic-682379730 tempest-ServersTestMultiNic-682379730-project-member] Acquiring lock "f36ba9db-c547-4d77-9e49-24bfcc995e89" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1008.381153] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fe2ae47e-5133-4c60-b3e2-6249def18f17 tempest-ServersTestMultiNic-682379730 tempest-ServersTestMultiNic-682379730-project-member] Lock "f36ba9db-c547-4d77-9e49-24bfcc995e89" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1016.799740] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1017.800668] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1017.800960] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1017.816476] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1017.816723] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1017.816940] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1017.817118] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1017.818361] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08d2e3e0-3131-47d1-8435-790f0d4c989c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.827424] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74c37e08-a8c0-4713-ac59-71e0f9411a53 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.842509] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f52d63a7-3329-48a1-abbf-fafd5482da33 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.848741] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21dbbe75-7c49-4592-be3a-c4492c2a1d1e {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.878815] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180841MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1017.878902] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1017.879092] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1017.955411] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8b261c6e-741c-4d6c-9567-566af85cd68f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1017.955626] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9c586d33-c563-45c7-8c54-1638a78a669c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1017.955809] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 6580c348-f5a4-4f20-a6fb-8942202a526e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1017.955965] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fed6991c-9b59-43bb-8cda-96053adb798b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1017.956146] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 707828f6-0267-42ff-95e5-6b328382b017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1017.956297] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0768fe80-7dd3-42ec-8e22-42a6aece5bef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1017.956445] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9e47d3ce-3897-458b-ac85-d98745e9aeb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1017.956587] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea492fb8-2352-436c-a7d5-f20423f4d353 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1017.956729] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c648c89a-ca70-4a15-9083-0cbe9e5bee23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1017.956867] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9793d383-9033-4f86-b7bb-6b2e43347cd6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1017.969152] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 6c329e27-945e-4996-9994-85d207c35325 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1017.980463] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 5fbee4c3-bc7c-4582-b976-b0d619a69cdb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1017.989901] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 09c05646-301a-4d74-957c-1c9c6b7ab44b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.000139] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f6aedef6-3d4d-4839-863b-771ac818a1c4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.011374] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 34481f0e-b35a-4405-be54-ac23326f1183 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.021763] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ae68e8fe-d3d6-4313-85d7-7e2fefa3a1ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.031382] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec330488-db38-486f-8d54-17afd9f07ce3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.041081] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ae1668bc-04cb-4767-847a-d2b7c3d95156 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.051457] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 8d9addd9-ce3d-4d41-9736-1c7ca0b9fbbe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.062936] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7b96b0d5-a10c-4f7f-9113-46c85ea62dfe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.073293] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 94760699-7f13-42e2-abb2-45e3374eeccb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.083015] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance e239df07-066e-4dff-8302-9945a610a43a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.092650] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 809c38e0-bc92-4a77-b307-773b6df211c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.102457] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18c8cc6b-a7aa-43fc-b048-1d788f4c162b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.112212] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d90509f8-1957-4bb3-b4ec-eba8b37705b6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.124148] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance dabdde79-50a8-43fd-a998-868aec05d825 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.133981] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 67eb58c3-a895-4427-9197-3b0c731a123a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.144027] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.154475] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 4877dc66-8ac8-4f7e-9a49-97a7adb95e72 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.164381] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d9dcb5d4-e8a3-4d4d-af94-1bde87121c08 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.177611] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 57445f5b-8a3a-4d55-b926-ee2d3e24b6ce has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.186156] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 53718899-b65f-4e3b-a8d6-7277e946ab43 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.199420] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 236faf76-d72e-4c2b-9b44-9d1866491310 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.208740] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f36ba9db-c547-4d77-9e49-24bfcc995e89 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1018.208987] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1018.209160] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1018.579698] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76d2a11e-7ffa-4c7d-9f93-f1424186f606 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1018.587239] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-764cb007-a5ff-4eea-872d-5696c82be3f6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1018.616516] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24e5b910-b6dd-4256-a5f7-08b76bb74b14 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1018.623193] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a344919e-bc95-47af-a7d1-b70ff6d3bbd3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1018.635962] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1018.644689] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1018.659402] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1018.659593] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.780s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1019.659421] env[67964]: DEBUG oslo_service.periodic_task [None 
req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1019.659834] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1019.801129] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1020.800395] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1021.800107] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1022.483018] env[67964]: WARNING oslo_vmware.rw_handles [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1022.483018] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1022.483018] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1022.483018] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1022.483018] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1022.483018] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1022.483018] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1022.483018] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1022.483018] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1022.483018] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1022.483018] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1022.483018] env[67964]: ERROR oslo_vmware.rw_handles [ 1022.483018] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/30923cbe-3772-4752-9591-1fdff197ecd6/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1022.484353] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af 
tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1022.484603] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Copying Virtual Disk [datastore1] vmware_temp/30923cbe-3772-4752-9591-1fdff197ecd6/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/30923cbe-3772-4752-9591-1fdff197ecd6/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1022.484994] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-626a3314-04e6-431c-9a1a-799d5e6750ad {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1022.493271] env[67964]: DEBUG oslo_vmware.api [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Waiting for the task: (returnval){ [ 1022.493271] env[67964]: value = "task-3456772" [ 1022.493271] env[67964]: _type = "Task" [ 1022.493271] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1022.501352] env[67964]: DEBUG oslo_vmware.api [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Task: {'id': task-3456772, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1022.800369] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1022.800574] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1022.800699] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1022.826094] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1022.826094] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1022.826094] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1022.826094] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1022.826094] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1022.826462] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1022.826462] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1022.826462] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1022.826776] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1022.827174] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1022.827508] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1023.003973] env[67964]: DEBUG oslo_vmware.exceptions [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Fault InvalidArgument not matched. 
{{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1023.004315] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1023.004845] env[67964]: ERROR nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1023.004845] env[67964]: Faults: ['InvalidArgument'] [ 1023.004845] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Traceback (most recent call last): [ 1023.004845] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1023.004845] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] yield resources [ 1023.004845] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1023.004845] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] self.driver.spawn(context, instance, image_meta, [ 1023.004845] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1023.004845] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1023.004845] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1023.004845] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] self._fetch_image_if_missing(context, vi) [ 1023.004845] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1023.005225] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] image_cache(vi, tmp_image_ds_loc) [ 1023.005225] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1023.005225] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] vm_util.copy_virtual_disk( [ 1023.005225] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1023.005225] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] session._wait_for_task(vmdk_copy_task) [ 1023.005225] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1023.005225] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] return self.wait_for_task(task_ref) [ 1023.005225] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1023.005225] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] return evt.wait() [ 1023.005225] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1023.005225] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] result = hub.switch() [ 1023.005225] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1023.005225] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] return self.greenlet.switch() [ 1023.005571] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1023.005571] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] self.f(*self.args, **self.kw) [ 1023.005571] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1023.005571] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] raise exceptions.translate_fault(task_info.error) [ 1023.005571] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1023.005571] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Faults: ['InvalidArgument'] [ 1023.005571] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] [ 1023.005571] env[67964]: INFO nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Terminating instance [ 1023.006662] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1023.006866] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1023.007119] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-546f1e87-b4e4-4dab-849b-3567e0e51d91 
{{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.009426] env[67964]: DEBUG nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1023.009630] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1023.010342] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb703a6d-abd5-4b82-b213-69d358225e99 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.016936] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1023.017124] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f898e143-cb6c-4aab-a6ea-82648bbf21ee {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.019246] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1023.019414] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1023.020368] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4e7402e5-995c-460c-aaf5-1c170b78c619 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.024983] env[67964]: DEBUG oslo_vmware.api [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Waiting for the task: (returnval){ [ 1023.024983] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]523ed95e-95d1-a6af-536d-6cb773e11034" [ 1023.024983] env[67964]: _type = "Task" [ 1023.024983] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1023.033312] env[67964]: DEBUG oslo_vmware.api [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]523ed95e-95d1-a6af-536d-6cb773e11034, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1023.078547] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1023.078849] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1023.079045] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Deleting the datastore file [datastore1] 8b261c6e-741c-4d6c-9567-566af85cd68f {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1023.079318] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-66c85cec-d811-449c-9f3d-a991337733ad {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.086212] env[67964]: DEBUG oslo_vmware.api [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Waiting for the task: (returnval){ [ 1023.086212] env[67964]: value = "task-3456774" [ 1023.086212] env[67964]: _type = "Task" [ 1023.086212] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1023.094323] env[67964]: DEBUG oslo_vmware.api [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Task: {'id': task-3456774, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1023.535603] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1023.535861] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Creating directory with path [datastore1] vmware_temp/a7acba5a-e00c-4c8f-97c2-d7d57980fb33/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1023.536107] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-96a37d6a-1cde-464c-b8c4-d4f785c501e5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.547434] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Created directory with path [datastore1] vmware_temp/a7acba5a-e00c-4c8f-97c2-d7d57980fb33/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1023.547640] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Fetch image to [datastore1] vmware_temp/a7acba5a-e00c-4c8f-97c2-d7d57980fb33/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1023.547810] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/a7acba5a-e00c-4c8f-97c2-d7d57980fb33/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1023.548570] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65a31243-8f9d-45b0-b808-e6d4fd9d6591 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.555278] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8f001c2-68ca-46d9-b446-a11522b48e87 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.564141] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8640903-9d90-4336-8c7a-16956e2c2efb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.596677] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5ee4b08d-2b7e-4ca5-b459-067820a8f321 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.604466] env[67964]: DEBUG oslo_vmware.api [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Task: {'id': task-3456774, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.0801} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1023.605923] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1023.606120] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1023.606289] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1023.606458] env[67964]: INFO nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1023.608239] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8c812b87-454f-4d6b-88c1-fee6d33f98c8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.610304] env[67964]: DEBUG nova.compute.claims [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1023.610304] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1023.610993] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1023.631563] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1023.683040] env[67964]: DEBUG oslo_vmware.rw_handles [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a7acba5a-e00c-4c8f-97c2-d7d57980fb33/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1023.744073] env[67964]: DEBUG oslo_vmware.rw_handles [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1023.744352] env[67964]: DEBUG oslo_vmware.rw_handles [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a7acba5a-e00c-4c8f-97c2-d7d57980fb33/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1023.823186] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1024.078804] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16452d69-c655-4cbc-8e5a-7ffd8d07bfc2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.086531] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26c9d0f3-4e27-46c9-a5f7-05501459674e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.116768] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7246824a-1c88-4203-bf55-1d1b26b03dfa {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.124888] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5952210-917c-454b-915c-93d6d975c979 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.137664] env[67964]: DEBUG nova.compute.provider_tree [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1024.152467] env[67964]: DEBUG nova.scheduler.client.report [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1024.180397] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.570s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1024.180995] env[67964]: ERROR nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1024.180995] env[67964]: Faults: ['InvalidArgument'] [ 1024.180995] env[67964]: ERROR 
nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Traceback (most recent call last): [ 1024.180995] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1024.180995] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] self.driver.spawn(context, instance, image_meta, [ 1024.180995] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1024.180995] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1024.180995] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1024.180995] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] self._fetch_image_if_missing(context, vi) [ 1024.180995] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1024.180995] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] image_cache(vi, tmp_image_ds_loc) [ 1024.180995] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1024.181319] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] vm_util.copy_virtual_disk( [ 1024.181319] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1024.181319] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] session._wait_for_task(vmdk_copy_task) [ 1024.181319] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1024.181319] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] return self.wait_for_task(task_ref) [ 1024.181319] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1024.181319] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] return evt.wait() [ 1024.181319] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1024.181319] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] result = hub.switch() [ 1024.181319] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1024.181319] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] return self.greenlet.switch() [ 1024.181319] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1024.181319] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] self.f(*self.args, **self.kw) [ 1024.181619] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1024.181619] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] raise exceptions.translate_fault(task_info.error) [ 1024.181619] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1024.181619] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Faults: ['InvalidArgument'] [ 1024.181619] env[67964]: ERROR nova.compute.manager [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] [ 1024.181803] env[67964]: DEBUG nova.compute.utils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1024.183346] env[67964]: DEBUG nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Build of instance 8b261c6e-741c-4d6c-9567-566af85cd68f was re-scheduled: A specified parameter was not correct: fileType [ 1024.183346] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1024.183718] env[67964]: DEBUG nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1024.183889] env[67964]: DEBUG nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1024.184105] env[67964]: DEBUG nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1024.184330] env[67964]: DEBUG nova.network.neutron [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1024.640022] env[67964]: DEBUG nova.network.neutron [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1024.644732] env[67964]: INFO nova.compute.manager [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Took 0.46 seconds to deallocate network for instance. [ 1024.745130] env[67964]: INFO nova.scheduler.client.report [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Deleted allocations for instance 8b261c6e-741c-4d6c-9567-566af85cd68f [ 1024.770185] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6b3ffc70-7e1e-4c98-90b6-8011656496af tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Lock "8b261c6e-741c-4d6c-9567-566af85cd68f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 435.475s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1024.773694] env[67964]: DEBUG oslo_concurrency.lockutils [None req-20959877-1cd3-47cf-a18c-1d57e921fdfa tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Lock "8b261c6e-741c-4d6c-9567-566af85cd68f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 234.955s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1024.773694] env[67964]: DEBUG oslo_concurrency.lockutils [None req-20959877-1cd3-47cf-a18c-1d57e921fdfa tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Acquiring lock "8b261c6e-741c-4d6c-9567-566af85cd68f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1024.773694] env[67964]: DEBUG oslo_concurrency.lockutils [None req-20959877-1cd3-47cf-a18c-1d57e921fdfa tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Lock "8b261c6e-741c-4d6c-9567-566af85cd68f-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1024.774024] env[67964]: DEBUG oslo_concurrency.lockutils [None req-20959877-1cd3-47cf-a18c-1d57e921fdfa tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Lock "8b261c6e-741c-4d6c-9567-566af85cd68f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1024.775167] env[67964]: INFO nova.compute.manager [None req-20959877-1cd3-47cf-a18c-1d57e921fdfa tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Terminating instance [ 1024.777958] env[67964]: DEBUG nova.compute.manager [None req-20959877-1cd3-47cf-a18c-1d57e921fdfa tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1024.777958] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-20959877-1cd3-47cf-a18c-1d57e921fdfa tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1024.777958] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3d6c08a0-7e03-4e6b-99aa-8b7b022b1270 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.787861] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54de4ec4-bd2b-4fb3-935a-6038f5080cce {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.800983] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1024.807014] env[67964]: DEBUG nova.compute.manager [None req-3deb3d74-23c8-49db-845f-18a0428a7b24 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 6c329e27-945e-4996-9994-85d207c35325] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1024.828119] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-20959877-1cd3-47cf-a18c-1d57e921fdfa tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8b261c6e-741c-4d6c-9567-566af85cd68f could not be found. 
[ 1024.828325] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-20959877-1cd3-47cf-a18c-1d57e921fdfa tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1024.828496] env[67964]: INFO nova.compute.manager [None req-20959877-1cd3-47cf-a18c-1d57e921fdfa tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1024.828735] env[67964]: DEBUG oslo.service.loopingcall [None req-20959877-1cd3-47cf-a18c-1d57e921fdfa tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1024.831579] env[67964]: DEBUG nova.compute.manager [-] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1024.831671] env[67964]: DEBUG nova.network.neutron [-] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1024.834553] env[67964]: DEBUG nova.compute.manager [None req-3deb3d74-23c8-49db-845f-18a0428a7b24 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 6c329e27-945e-4996-9994-85d207c35325] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1024.866072] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3deb3d74-23c8-49db-845f-18a0428a7b24 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "6c329e27-945e-4996-9994-85d207c35325" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.528s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1024.868244] env[67964]: DEBUG nova.network.neutron [-] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1024.879206] env[67964]: INFO nova.compute.manager [-] [instance: 8b261c6e-741c-4d6c-9567-566af85cd68f] Took 0.05 seconds to deallocate network for instance. [ 1024.885250] env[67964]: DEBUG nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Starting instance... 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1024.952116] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1024.952116] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1024.953033] env[67964]: INFO nova.compute.claims [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1025.028732] env[67964]: DEBUG oslo_concurrency.lockutils [None req-20959877-1cd3-47cf-a18c-1d57e921fdfa tempest-ImagesOneServerTestJSON-1759666333 tempest-ImagesOneServerTestJSON-1759666333-project-member] Lock "8b261c6e-741c-4d6c-9567-566af85cd68f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.257s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1025.126501] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "9cd7ef82-147a-4303-a773-32b161f819ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1025.126733] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "9cd7ef82-147a-4303-a773-32b161f819ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1025.423836] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0ca3e23-ef90-44a4-bf1f-804bb3796395 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.431334] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69365b42-fd40-4e27-bad5-d892ca10820c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.461929] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-048cd15f-d236-4963-8606-d97f9be9ded4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.469289] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-88f6aac7-fc71-411e-ab15-4c5d20ec1105 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.482618] env[67964]: DEBUG nova.compute.provider_tree [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1025.491323] env[67964]: DEBUG nova.scheduler.client.report [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1025.504962] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.554s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1025.505456] env[67964]: DEBUG nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1025.537725] env[67964]: DEBUG nova.compute.utils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1025.539139] env[67964]: DEBUG nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1025.539315] env[67964]: DEBUG nova.network.neutron [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1025.548191] env[67964]: DEBUG nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Start building block device mappings for instance. 
{{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1025.593411] env[67964]: DEBUG nova.policy [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a7375dc1bc74238baad8152b81f0bb8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d9f3bca639141e9b9222aa98fe44aaf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1025.611436] env[67964]: DEBUG nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Start spawning the instance on the hypervisor. {{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1025.637082] env[67964]: DEBUG nova.virt.hardware [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1025.637082] env[67964]: DEBUG nova.virt.hardware [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1025.637082] env[67964]: DEBUG nova.virt.hardware [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1025.637279] env[67964]: DEBUG nova.virt.hardware [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1025.637279] env[67964]: DEBUG nova.virt.hardware [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1025.637349] env[67964]: DEBUG nova.virt.hardware [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 
tempest-ServersTestJSON-1119709012-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1025.637550] env[67964]: DEBUG nova.virt.hardware [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1025.637696] env[67964]: DEBUG nova.virt.hardware [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1025.637850] env[67964]: DEBUG nova.virt.hardware [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1025.638013] env[67964]: DEBUG nova.virt.hardware [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1025.638187] env[67964]: DEBUG nova.virt.hardware [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1025.639035] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eed3b1eb-b7f9-40b9-92ff-c0709a14ca61 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.646724] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8942a6b2-5700-4b12-a88e-3e41906e8288 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.991556] env[67964]: DEBUG nova.network.neutron [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Successfully created port: 68af4695-d6dd-4214-9a04-31cf648d0de1 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1026.759112] env[67964]: DEBUG nova.network.neutron [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Successfully updated port: 68af4695-d6dd-4214-9a04-31cf648d0de1 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1026.779265] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Acquiring lock "refresh_cache-5fbee4c3-bc7c-4582-b976-b0d619a69cdb" {{(pid=67964) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1026.779409] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Acquired lock "refresh_cache-5fbee4c3-bc7c-4582-b976-b0d619a69cdb" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1026.779559] env[67964]: DEBUG nova.network.neutron [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1026.826344] env[67964]: DEBUG nova.network.neutron [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1026.886218] env[67964]: DEBUG nova.compute.manager [req-4c635db4-f77e-4d33-afbf-1ae75ed404df req-ea56c203-a099-4fc6-b06e-e828f8a83360 service nova] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Received event network-vif-plugged-68af4695-d6dd-4214-9a04-31cf648d0de1 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1026.886342] env[67964]: DEBUG oslo_concurrency.lockutils [req-4c635db4-f77e-4d33-afbf-1ae75ed404df req-ea56c203-a099-4fc6-b06e-e828f8a83360 service nova] Acquiring lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1026.886540] env[67964]: DEBUG oslo_concurrency.lockutils [req-4c635db4-f77e-4d33-afbf-1ae75ed404df req-ea56c203-a099-4fc6-b06e-e828f8a83360 service nova] Lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1026.888311] env[67964]: DEBUG oslo_concurrency.lockutils [req-4c635db4-f77e-4d33-afbf-1ae75ed404df req-ea56c203-a099-4fc6-b06e-e828f8a83360 service nova] Lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1026.888311] env[67964]: DEBUG nova.compute.manager [req-4c635db4-f77e-4d33-afbf-1ae75ed404df req-ea56c203-a099-4fc6-b06e-e828f8a83360 service nova] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] No waiting events found dispatching network-vif-plugged-68af4695-d6dd-4214-9a04-31cf648d0de1 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1026.888311] env[67964]: WARNING nova.compute.manager [req-4c635db4-f77e-4d33-afbf-1ae75ed404df req-ea56c203-a099-4fc6-b06e-e828f8a83360 service nova] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Received unexpected event network-vif-plugged-68af4695-d6dd-4214-9a04-31cf648d0de1 for instance with vm_state building and task_state spawning. 
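
The per-instance "-events" and "refresh_cache-" lock lines above come from oslo.concurrency's lockutils helpers, which log the acquire/wait/hold timings automatically. A minimal sketch of that idiom follows; the lock names are copied from the log, but everything else is a placeholder, not Nova code:

    # Sketch of the lockutils idiom behind the "Acquiring lock ...",
    # "acquired ... :: waited 0.000s" and "'released' ... :: held 0.000s"
    # DEBUG lines. Only the lock names are taken from the log.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('5fbee4c3-bc7c-4582-b976-b0d619a69cdb-events')
    def _pop_event():
        # Runs with the per-instance event lock held; lockutils logs the
        # wait and hold durations on entry and exit.
        return None

    def refresh_cache(instance_uuid):
        # Context-manager form, as used for the "refresh_cache-<uuid>" lock.
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass  # rebuild the instance network info cache here

    _pop_event()
    refresh_cache('5fbee4c3-bc7c-4582-b976-b0d619a69cdb')
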
[ 1026.888311] env[67964]: DEBUG nova.compute.manager [req-4c635db4-f77e-4d33-afbf-1ae75ed404df req-ea56c203-a099-4fc6-b06e-e828f8a83360 service nova] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Received event network-changed-68af4695-d6dd-4214-9a04-31cf648d0de1 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1026.888438] env[67964]: DEBUG nova.compute.manager [req-4c635db4-f77e-4d33-afbf-1ae75ed404df req-ea56c203-a099-4fc6-b06e-e828f8a83360 service nova] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Refreshing instance network info cache due to event network-changed-68af4695-d6dd-4214-9a04-31cf648d0de1. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1026.888438] env[67964]: DEBUG oslo_concurrency.lockutils [req-4c635db4-f77e-4d33-afbf-1ae75ed404df req-ea56c203-a099-4fc6-b06e-e828f8a83360 service nova] Acquiring lock "refresh_cache-5fbee4c3-bc7c-4582-b976-b0d619a69cdb" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1027.039525] env[67964]: DEBUG nova.network.neutron [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Updating instance_info_cache with network_info: [{"id": "68af4695-d6dd-4214-9a04-31cf648d0de1", "address": "fa:16:3e:1d:e9:6b", "network": {"id": "be168e1c-db65-44f2-b839-98ba6b7bb12c", "bridge": "br-int", "label": "tempest-ServersTestJSON-699649795-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6d9f3bca639141e9b9222aa98fe44aaf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "65e4a2b4-fd64-4ac9-b2ec-bac768b501c5", "external-id": "nsx-vlan-transportzone-449", "segmentation_id": 449, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap68af4695-d6", "ovs_interfaceid": "68af4695-d6dd-4214-9a04-31cf648d0de1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1027.054338] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Releasing lock "refresh_cache-5fbee4c3-bc7c-4582-b976-b0d619a69cdb" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1027.054631] env[67964]: DEBUG nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Instance network_info: |[{"id": "68af4695-d6dd-4214-9a04-31cf648d0de1", "address": "fa:16:3e:1d:e9:6b", "network": {"id": "be168e1c-db65-44f2-b839-98ba6b7bb12c", "bridge": "br-int", "label": "tempest-ServersTestJSON-699649795-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": 
"192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6d9f3bca639141e9b9222aa98fe44aaf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "65e4a2b4-fd64-4ac9-b2ec-bac768b501c5", "external-id": "nsx-vlan-transportzone-449", "segmentation_id": 449, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap68af4695-d6", "ovs_interfaceid": "68af4695-d6dd-4214-9a04-31cf648d0de1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1027.054963] env[67964]: DEBUG oslo_concurrency.lockutils [req-4c635db4-f77e-4d33-afbf-1ae75ed404df req-ea56c203-a099-4fc6-b06e-e828f8a83360 service nova] Acquired lock "refresh_cache-5fbee4c3-bc7c-4582-b976-b0d619a69cdb" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1027.055163] env[67964]: DEBUG nova.network.neutron [req-4c635db4-f77e-4d33-afbf-1ae75ed404df req-ea56c203-a099-4fc6-b06e-e828f8a83360 service nova] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Refreshing network info cache for port 68af4695-d6dd-4214-9a04-31cf648d0de1 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1027.056174] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1d:e9:6b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '65e4a2b4-fd64-4ac9-b2ec-bac768b501c5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '68af4695-d6dd-4214-9a04-31cf648d0de1', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1027.064439] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Creating folder: Project (6d9f3bca639141e9b9222aa98fe44aaf). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1027.065605] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-953927b2-6a65-4031-9a76-85d220fb0010 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1027.080052] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Created folder: Project (6d9f3bca639141e9b9222aa98fe44aaf) in parent group-v690366. [ 1027.080052] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Creating folder: Instances. Parent ref: group-v690427. 
{{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1027.080052] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8b7d8b74-d3e3-4776-b662-b343be4f9ce6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1027.089644] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Created folder: Instances in parent group-v690427. [ 1027.089876] env[67964]: DEBUG oslo.service.loopingcall [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1027.090064] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1027.090259] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-352f4ed4-43c7-42be-ad3e-8a50545b0e72 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1027.111392] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1027.111392] env[67964]: value = "task-3456777" [ 1027.111392] env[67964]: _type = "Task" [ 1027.111392] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1027.119280] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456777, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1027.569346] env[67964]: DEBUG nova.network.neutron [req-4c635db4-f77e-4d33-afbf-1ae75ed404df req-ea56c203-a099-4fc6-b06e-e828f8a83360 service nova] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Updated VIF entry in instance network info cache for port 68af4695-d6dd-4214-9a04-31cf648d0de1. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1027.569700] env[67964]: DEBUG nova.network.neutron [req-4c635db4-f77e-4d33-afbf-1ae75ed404df req-ea56c203-a099-4fc6-b06e-e828f8a83360 service nova] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Updating instance_info_cache with network_info: [{"id": "68af4695-d6dd-4214-9a04-31cf648d0de1", "address": "fa:16:3e:1d:e9:6b", "network": {"id": "be168e1c-db65-44f2-b839-98ba6b7bb12c", "bridge": "br-int", "label": "tempest-ServersTestJSON-699649795-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6d9f3bca639141e9b9222aa98fe44aaf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "65e4a2b4-fd64-4ac9-b2ec-bac768b501c5", "external-id": "nsx-vlan-transportzone-449", "segmentation_id": 449, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap68af4695-d6", "ovs_interfaceid": "68af4695-d6dd-4214-9a04-31cf648d0de1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1027.581473] env[67964]: DEBUG oslo_concurrency.lockutils [req-4c635db4-f77e-4d33-afbf-1ae75ed404df req-ea56c203-a099-4fc6-b06e-e828f8a83360 service nova] Releasing lock "refresh_cache-5fbee4c3-bc7c-4582-b976-b0d619a69cdb" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1027.621906] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456777, 'name': CreateVM_Task, 'duration_secs': 0.275334} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1027.622229] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1027.622788] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1027.622982] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1027.623369] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1027.623608] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9fff0e5f-9230-4ff4-9e3f-76d2cc67e58e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1027.628155] env[67964]: DEBUG oslo_vmware.api [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Waiting for the task: (returnval){ [ 1027.628155] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]522a57ff-c5eb-20ff-4e3a-139f949f7a36" [ 1027.628155] env[67964]: _type = "Task" [ 1027.628155] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1027.635681] env[67964]: DEBUG oslo_vmware.api [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]522a57ff-c5eb-20ff-4e3a-139f949f7a36, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1028.138325] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1028.138713] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1028.138795] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1031.477032] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f14fe89e-ab67-4664-87b4-e959748abdab tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Acquiring lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1044.167536] env[67964]: DEBUG oslo_concurrency.lockutils [None req-58718077-6acc-4c85-9126-d2d0e0a2a01e tempest-ServerAddressesNegativeTestJSON-188398278 tempest-ServerAddressesNegativeTestJSON-188398278-project-member] Acquiring lock "fbf2ae36-60a6-48e2-b115-22b13b5c4cc2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1044.167863] env[67964]: DEBUG oslo_concurrency.lockutils [None req-58718077-6acc-4c85-9126-d2d0e0a2a01e tempest-ServerAddressesNegativeTestJSON-188398278 tempest-ServerAddressesNegativeTestJSON-188398278-project-member] Lock "fbf2ae36-60a6-48e2-b115-22b13b5c4cc2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1044.547359] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2a223a76-9979-4324-acf0-9a83c140325b tempest-ServersTestManualDisk-2066344586 tempest-ServersTestManualDisk-2066344586-project-member] Acquiring lock "02b1d6da-0aa2-4199-a86a-fa5b197b2813" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1044.547663] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2a223a76-9979-4324-acf0-9a83c140325b tempest-ServersTestManualDisk-2066344586 tempest-ServersTestManualDisk-2066344586-project-member] Lock "02b1d6da-0aa2-4199-a86a-fa5b197b2813" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 
0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1046.762331] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d8e3d421-ba5f-4b4b-8141-6a42e305c2d2 tempest-InstanceActionsTestJSON-1556599388 tempest-InstanceActionsTestJSON-1556599388-project-member] Acquiring lock "b2cc5ba7-c5d1-4ecf-ba3a-fee3facbd159" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1046.762915] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d8e3d421-ba5f-4b4b-8141-6a42e305c2d2 tempest-InstanceActionsTestJSON-1556599388 tempest-InstanceActionsTestJSON-1556599388-project-member] Lock "b2cc5ba7-c5d1-4ecf-ba3a-fee3facbd159" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1054.510665] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0f1243af-7816-42a7-a312-8fc9b42208f3 tempest-ServerShowV257Test-1393857793 tempest-ServerShowV257Test-1393857793-project-member] Acquiring lock "4cdc869e-2b97-4107-ae4d-49f99131048a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1054.511036] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0f1243af-7816-42a7-a312-8fc9b42208f3 tempest-ServerShowV257Test-1393857793 tempest-ServerShowV257Test-1393857793-project-member] Lock "4cdc869e-2b97-4107-ae4d-49f99131048a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1057.288027] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a275a597-95db-4676-8dbb-b8a4cecea5c7 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "7fe6f046-65c9-4464-931c-07e781c497aa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1057.288311] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a275a597-95db-4676-8dbb-b8a4cecea5c7 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "7fe6f046-65c9-4464-931c-07e781c497aa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1070.290155] env[67964]: WARNING oslo_vmware.rw_handles [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1070.290155] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1070.290155] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1070.290155] env[67964]: ERROR oslo_vmware.rw_handles 
self._conn.getresponse() [ 1070.290155] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1070.290155] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1070.290155] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1070.290155] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1070.290155] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1070.290155] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1070.290155] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1070.290155] env[67964]: ERROR oslo_vmware.rw_handles [ 1070.290861] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/a7acba5a-e00c-4c8f-97c2-d7d57980fb33/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1070.292656] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1070.292939] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Copying Virtual Disk [datastore1] vmware_temp/a7acba5a-e00c-4c8f-97c2-d7d57980fb33/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/a7acba5a-e00c-4c8f-97c2-d7d57980fb33/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1070.293270] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ecc90480-fda8-4276-8dfd-f3b3f9b32c60 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1070.301054] env[67964]: DEBUG oslo_vmware.api [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Waiting for the task: (returnval){ [ 1070.301054] env[67964]: value = "task-3456778" [ 1070.301054] env[67964]: _type = "Task" [ 1070.301054] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1070.308817] env[67964]: DEBUG oslo_vmware.api [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Task: {'id': task-3456778, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1070.811312] env[67964]: DEBUG oslo_vmware.exceptions [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1070.811594] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1070.812766] env[67964]: ERROR nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1070.812766] env[67964]: Faults: ['InvalidArgument'] [ 1070.812766] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Traceback (most recent call last): [ 1070.812766] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1070.812766] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] yield resources [ 1070.812766] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1070.812766] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] self.driver.spawn(context, instance, image_meta, [ 1070.812766] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1070.812766] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1070.812766] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1070.812766] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] self._fetch_image_if_missing(context, vi) [ 1070.812766] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1070.813296] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] image_cache(vi, tmp_image_ds_loc) [ 1070.813296] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1070.813296] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] vm_util.copy_virtual_disk( [ 1070.813296] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1070.813296] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] session._wait_for_task(vmdk_copy_task) [ 1070.813296] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1070.813296] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] return self.wait_for_task(task_ref) [ 1070.813296] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1070.813296] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] return evt.wait() [ 1070.813296] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1070.813296] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] result = hub.switch() [ 1070.813296] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1070.813296] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] return self.greenlet.switch() [ 1070.813665] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1070.813665] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] self.f(*self.args, **self.kw) [ 1070.813665] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1070.813665] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] raise exceptions.translate_fault(task_info.error) [ 1070.813665] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1070.813665] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Faults: ['InvalidArgument'] [ 1070.813665] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] [ 1070.813665] env[67964]: INFO nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Terminating instance [ 1070.814507] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1070.814734] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 
tempest-ServerDiagnosticsV248Test-274850163-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1070.815438] env[67964]: DEBUG nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1070.815620] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1070.815840] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-08551843-3fbb-430f-b70f-e9698baaa9e0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1070.818211] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eeda66f8-a52c-48a9-b1de-f64802c5eb29 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1070.825218] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1070.825433] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-302cb05c-a708-4dee-b844-4f075a1325f7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1070.827581] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1070.827758] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1070.828708] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c9d35f57-43ea-43e9-8754-f9f744275e44 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1070.833424] env[67964]: DEBUG oslo_vmware.api [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Waiting for the task: (returnval){ [ 1070.833424] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52a48ce5-e6b3-b42e-9a09-dd6bfcf6e378" [ 1070.833424] env[67964]: _type = "Task" [ 1070.833424] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1070.847864] env[67964]: DEBUG oslo_vmware.api [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52a48ce5-e6b3-b42e-9a09-dd6bfcf6e378, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1070.888365] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1070.888595] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1070.888770] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Deleting the datastore file [datastore1] 9c586d33-c563-45c7-8c54-1638a78a669c {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1070.889051] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b7876d2c-26fa-4acd-892c-7af566e412cc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1070.895065] env[67964]: DEBUG oslo_vmware.api [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Waiting for the task: (returnval){ [ 1070.895065] env[67964]: value = "task-3456780" [ 1070.895065] env[67964]: _type = "Task" [ 1070.895065] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1070.903211] env[67964]: DEBUG oslo_vmware.api [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Task: {'id': task-3456780, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1071.343954] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1071.344295] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Creating directory with path [datastore1] vmware_temp/736fe7f2-067b-4b51-bf27-3ef43156c882/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1071.344531] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a8745f13-586b-4bbb-bf82-c380dfffc6b2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.355619] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Created directory with path [datastore1] vmware_temp/736fe7f2-067b-4b51-bf27-3ef43156c882/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1071.355808] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Fetch image to [datastore1] vmware_temp/736fe7f2-067b-4b51-bf27-3ef43156c882/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1071.355972] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/736fe7f2-067b-4b51-bf27-3ef43156c882/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1071.356689] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dd5b992-4068-45aa-9cb7-170a717f7a7a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.363209] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90e534e8-ab2c-46aa-868d-3effe09fbf07 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.372218] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ab06888-b84c-4842-a194-c4b603080897 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.405914] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-1fae92f4-27a2-4d16-a631-212191e09f48 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.414278] env[67964]: DEBUG oslo_vmware.api [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Task: {'id': task-3456780, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075883} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1071.414780] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1071.414967] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1071.415151] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1071.415327] env[67964]: INFO nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Took 0.60 seconds to destroy the instance on the hypervisor. 
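
Every vCenter operation in this log (CreateVM_Task, SearchDatastore_Task, CopyVirtualDisk_Task, DeleteDatastoreFile_Task) follows the same oslo.vmware idiom: invoke the SOAP method, receive a Task managed object, then poll it until completion, which is what produces the "Waiting for the task: (returnval){ value = task-... }" and "progress is N%" entries. A minimal sketch under the public oslo.vmware API; the endpoint, credentials, and managed-object references are placeholders, not values from this deployment:

    # Sketch of the oslo.vmware invoke-then-poll pattern. All arguments
    # are hypothetical; constructing the session connects to a vCenter.
    from oslo_vmware import api

    def make_session():
        return api.VMwareAPISession(
            'vcenter.example.org', 'user', 'secret',  # placeholder endpoint
            api_retry_count=10, task_poll_interval=0.5)

    def create_vm(session, folder_ref, config_spec, res_pool_ref):
        # invoke_api() issues Folder.CreateVM_Task and returns the Task
        # reference; wait_for_task() polls TaskInfo until vCenter reports
        # success, or raises the translated fault on error.
        task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                                  config=config_spec, pool=res_pool_ref)
        return session.wait_for_task(task)
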
[ 1071.417338] env[67964]: DEBUG nova.compute.claims [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1071.417510] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1071.417718] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1071.420716] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e8d82c2c-7394-4ba1-9738-83d42272820e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.444419] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1071.496757] env[67964]: DEBUG oslo_vmware.rw_handles [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/736fe7f2-067b-4b51-bf27-3ef43156c882/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1071.562359] env[67964]: DEBUG oslo_vmware.rw_handles [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1071.562507] env[67964]: DEBUG oslo_vmware.rw_handles [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/736fe7f2-067b-4b51-bf27-3ef43156c882/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1071.808923] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fdd83ea-90b2-41e3-9efe-b5e410e1ac8c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.817750] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dffb97bd-4ed6-45bc-b18b-f22eac6172b3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.848191] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-826147c6-caae-4d89-84e3-01704a24c1e2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.855709] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4ecf696-e27e-4e9e-8c67-9ece4796c131 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.868933] env[67964]: DEBUG nova.compute.provider_tree [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1071.879124] env[67964]: DEBUG nova.scheduler.client.report [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1071.895795] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.478s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1071.896661] env[67964]: ERROR nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1071.896661] env[67964]: Faults: ['InvalidArgument'] [ 1071.896661] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Traceback (most recent call last): [ 1071.896661] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1071.896661] 
env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] self.driver.spawn(context, instance, image_meta, [ 1071.896661] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1071.896661] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1071.896661] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1071.896661] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] self._fetch_image_if_missing(context, vi) [ 1071.896661] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1071.896661] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] image_cache(vi, tmp_image_ds_loc) [ 1071.896661] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1071.896997] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] vm_util.copy_virtual_disk( [ 1071.896997] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1071.896997] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] session._wait_for_task(vmdk_copy_task) [ 1071.896997] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1071.896997] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] return self.wait_for_task(task_ref) [ 1071.896997] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1071.896997] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] return evt.wait() [ 1071.896997] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1071.896997] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] result = hub.switch() [ 1071.896997] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1071.896997] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] return self.greenlet.switch() [ 1071.896997] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1071.896997] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] self.f(*self.args, **self.kw) [ 1071.897440] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1071.897440] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] raise exceptions.translate_fault(task_info.error) [ 1071.897440] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1071.897440] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Faults: ['InvalidArgument'] [ 1071.897440] env[67964]: ERROR nova.compute.manager [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] [ 1071.897440] env[67964]: DEBUG nova.compute.utils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1071.899099] env[67964]: DEBUG nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Build of instance 9c586d33-c563-45c7-8c54-1638a78a669c was re-scheduled: A specified parameter was not correct: fileType [ 1071.899099] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1071.899489] env[67964]: DEBUG nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1071.899661] env[67964]: DEBUG nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1071.899827] env[67964]: DEBUG nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1071.899984] env[67964]: DEBUG nova.network.neutron [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1072.254205] env[67964]: DEBUG nova.network.neutron [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1072.271673] env[67964]: INFO nova.compute.manager [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Took 0.37 seconds to deallocate network for instance. [ 1072.373753] env[67964]: INFO nova.scheduler.client.report [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Deleted allocations for instance 9c586d33-c563-45c7-8c54-1638a78a669c [ 1072.392200] env[67964]: DEBUG oslo_concurrency.lockutils [None req-446085fc-5d6e-41d4-aece-c9598b1920ae tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Lock "9c586d33-c563-45c7-8c54-1638a78a669c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 480.548s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1072.393215] env[67964]: DEBUG oslo_concurrency.lockutils [None req-741723e0-d37f-48f1-8bd7-8462d0b46433 tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Lock "9c586d33-c563-45c7-8c54-1638a78a669c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 280.788s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1072.393435] env[67964]: DEBUG oslo_concurrency.lockutils [None req-741723e0-d37f-48f1-8bd7-8462d0b46433 tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Acquiring lock "9c586d33-c563-45c7-8c54-1638a78a669c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1072.393638] env[67964]: DEBUG oslo_concurrency.lockutils [None req-741723e0-d37f-48f1-8bd7-8462d0b46433 tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Lock "9c586d33-c563-45c7-8c54-1638a78a669c-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1072.393801] env[67964]: DEBUG oslo_concurrency.lockutils [None req-741723e0-d37f-48f1-8bd7-8462d0b46433 tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Lock "9c586d33-c563-45c7-8c54-1638a78a669c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1072.396624] env[67964]: INFO nova.compute.manager [None req-741723e0-d37f-48f1-8bd7-8462d0b46433 tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Terminating instance [ 1072.398329] env[67964]: DEBUG nova.compute.manager [None req-741723e0-d37f-48f1-8bd7-8462d0b46433 tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1072.398518] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-741723e0-d37f-48f1-8bd7-8462d0b46433 tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1072.398760] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-edc72c5e-3847-4d53-835a-3cd9ffb02fa8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1072.408409] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-464c6a03-cfa0-4572-9ab6-8ac21b15102a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1072.421083] env[67964]: DEBUG nova.compute.manager [None req-8a1228d9-e9cd-4b75-a74e-e426aacd4e19 tempest-ServersNegativeTestJSON-1937387395 tempest-ServersNegativeTestJSON-1937387395-project-member] [instance: 09c05646-301a-4d74-957c-1c9c6b7ab44b] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1072.441391] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-741723e0-d37f-48f1-8bd7-8462d0b46433 tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9c586d33-c563-45c7-8c54-1638a78a669c could not be found. 
[ 1072.441597] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-741723e0-d37f-48f1-8bd7-8462d0b46433 tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1072.441773] env[67964]: INFO nova.compute.manager [None req-741723e0-d37f-48f1-8bd7-8462d0b46433 tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1072.442016] env[67964]: DEBUG oslo.service.loopingcall [None req-741723e0-d37f-48f1-8bd7-8462d0b46433 tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1072.442246] env[67964]: DEBUG nova.compute.manager [-] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1072.442342] env[67964]: DEBUG nova.network.neutron [-] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1072.446040] env[67964]: DEBUG nova.compute.manager [None req-8a1228d9-e9cd-4b75-a74e-e426aacd4e19 tempest-ServersNegativeTestJSON-1937387395 tempest-ServersNegativeTestJSON-1937387395-project-member] [instance: 09c05646-301a-4d74-957c-1c9c6b7ab44b] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1072.466220] env[67964]: DEBUG nova.network.neutron [-] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1072.468283] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8a1228d9-e9cd-4b75-a74e-e426aacd4e19 tempest-ServersNegativeTestJSON-1937387395 tempest-ServersNegativeTestJSON-1937387395-project-member] Lock "09c05646-301a-4d74-957c-1c9c6b7ab44b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 237.390s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1072.474440] env[67964]: INFO nova.compute.manager [-] [instance: 9c586d33-c563-45c7-8c54-1638a78a669c] Took 0.03 seconds to deallocate network for instance. [ 1072.479031] env[67964]: DEBUG nova.compute.manager [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] [instance: f6aedef6-3d4d-4839-863b-771ac818a1c4] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1072.503083] env[67964]: DEBUG nova.compute.manager [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] [instance: f6aedef6-3d4d-4839-863b-771ac818a1c4] Instance disappeared before build.
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1072.546296] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] Lock "f6aedef6-3d4d-4839-863b-771ac818a1c4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 237.290s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1072.557459] env[67964]: DEBUG nova.compute.manager [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] [instance: 34481f0e-b35a-4405-be54-ac23326f1183] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1072.582462] env[67964]: DEBUG nova.compute.manager [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] [instance: 34481f0e-b35a-4405-be54-ac23326f1183] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1072.599975] env[67964]: DEBUG oslo_concurrency.lockutils [None req-741723e0-d37f-48f1-8bd7-8462d0b46433 tempest-ServerAddressesTestJSON-1247186641 tempest-ServerAddressesTestJSON-1247186641-project-member] Lock "9c586d33-c563-45c7-8c54-1638a78a669c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.207s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1072.609869] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] Lock "34481f0e-b35a-4405-be54-ac23326f1183" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 237.317s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1072.619099] env[67964]: DEBUG nova.compute.manager [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] [instance: ae68e8fe-d3d6-4313-85d7-7e2fefa3a1ca] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1072.643012] env[67964]: DEBUG nova.compute.manager [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] [instance: ae68e8fe-d3d6-4313-85d7-7e2fefa3a1ca] Instance disappeared before build.
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1072.666707] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c77294e0-0f39-4f49-9843-b4d106ef392d tempest-ListServersNegativeTestJSON-109683037 tempest-ListServersNegativeTestJSON-109683037-project-member] Lock "ae68e8fe-d3d6-4313-85d7-7e2fefa3a1ca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 237.342s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1072.676188] env[67964]: DEBUG nova.compute.manager [None req-201f4a21-fd81-4562-9b01-167a88262704 tempest-FloatingIPsAssociationNegativeTestJSON-782376586 tempest-FloatingIPsAssociationNegativeTestJSON-782376586-project-member] [instance: ec330488-db38-486f-8d54-17afd9f07ce3] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1072.700271] env[67964]: DEBUG nova.compute.manager [None req-201f4a21-fd81-4562-9b01-167a88262704 tempest-FloatingIPsAssociationNegativeTestJSON-782376586 tempest-FloatingIPsAssociationNegativeTestJSON-782376586-project-member] [instance: ec330488-db38-486f-8d54-17afd9f07ce3] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1072.728191] env[67964]: DEBUG oslo_concurrency.lockutils [None req-201f4a21-fd81-4562-9b01-167a88262704 tempest-FloatingIPsAssociationNegativeTestJSON-782376586 tempest-FloatingIPsAssociationNegativeTestJSON-782376586-project-member] Lock "ec330488-db38-486f-8d54-17afd9f07ce3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 236.360s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1072.739129] env[67964]: DEBUG nova.compute.manager [None req-7aa99c10-9a4f-4b46-8fd6-e8a7def3c9bc tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: ae1668bc-04cb-4767-847a-d2b7c3d95156] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1072.763613] env[67964]: DEBUG nova.compute.manager [None req-7aa99c10-9a4f-4b46-8fd6-e8a7def3c9bc tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] [instance: ae1668bc-04cb-4767-847a-d2b7c3d95156] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1072.790244] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7aa99c10-9a4f-4b46-8fd6-e8a7def3c9bc tempest-MigrationsAdminTest-29435014 tempest-MigrationsAdminTest-29435014-project-member] Lock "ae1668bc-04cb-4767-847a-d2b7c3d95156" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 235.681s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1072.801667] env[67964]: DEBUG nova.compute.manager [None req-e4583d77-8828-4618-a6e8-afc30d5a2d0b tempest-SecurityGroupsTestJSON-63199574 tempest-SecurityGroupsTestJSON-63199574-project-member] [instance: 8d9addd9-ce3d-4d41-9736-1c7ca0b9fbbe] Starting instance...
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1072.825491] env[67964]: DEBUG nova.compute.manager [None req-e4583d77-8828-4618-a6e8-afc30d5a2d0b tempest-SecurityGroupsTestJSON-63199574 tempest-SecurityGroupsTestJSON-63199574-project-member] [instance: 8d9addd9-ce3d-4d41-9736-1c7ca0b9fbbe] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1072.847460] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e4583d77-8828-4618-a6e8-afc30d5a2d0b tempest-SecurityGroupsTestJSON-63199574 tempest-SecurityGroupsTestJSON-63199574-project-member] Lock "8d9addd9-ce3d-4d41-9736-1c7ca0b9fbbe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 232.824s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1072.858615] env[67964]: DEBUG nova.compute.manager [None req-36d3e9b9-cc8a-443f-964b-22673d5c9f2d tempest-AttachInterfacesV270Test-1237935671 tempest-AttachInterfacesV270Test-1237935671-project-member] [instance: 7b96b0d5-a10c-4f7f-9113-46c85ea62dfe] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1072.899780] env[67964]: DEBUG nova.compute.manager [None req-36d3e9b9-cc8a-443f-964b-22673d5c9f2d tempest-AttachInterfacesV270Test-1237935671 tempest-AttachInterfacesV270Test-1237935671-project-member] [instance: 7b96b0d5-a10c-4f7f-9113-46c85ea62dfe] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1072.919941] env[67964]: DEBUG oslo_concurrency.lockutils [None req-36d3e9b9-cc8a-443f-964b-22673d5c9f2d tempest-AttachInterfacesV270Test-1237935671 tempest-AttachInterfacesV270Test-1237935671-project-member] Lock "7b96b0d5-a10c-4f7f-9113-46c85ea62dfe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 222.791s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1072.929143] env[67964]: DEBUG nova.compute.manager [None req-296520c0-1166-474c-8c01-fbcea84330c5 tempest-ServerRescueNegativeTestJSON-568068692 tempest-ServerRescueNegativeTestJSON-568068692-project-member] [instance: 94760699-7f13-42e2-abb2-45e3374eeccb] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1072.953397] env[67964]: DEBUG nova.compute.manager [None req-296520c0-1166-474c-8c01-fbcea84330c5 tempest-ServerRescueNegativeTestJSON-568068692 tempest-ServerRescueNegativeTestJSON-568068692-project-member] [instance: 94760699-7f13-42e2-abb2-45e3374eeccb] Instance disappeared before build.
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1072.991455] env[67964]: DEBUG oslo_concurrency.lockutils [None req-296520c0-1166-474c-8c01-fbcea84330c5 tempest-ServerRescueNegativeTestJSON-568068692 tempest-ServerRescueNegativeTestJSON-568068692-project-member] Lock "94760699-7f13-42e2-abb2-45e3374eeccb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 221.348s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1073.000667] env[67964]: DEBUG nova.compute.manager [None req-7ad16c3c-0f7e-4ccb-8ec7-4eab967472a4 tempest-ServerRescueNegativeTestJSON-568068692 tempest-ServerRescueNegativeTestJSON-568068692-project-member] [instance: e239df07-066e-4dff-8302-9945a610a43a] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1073.023045] env[67964]: DEBUG nova.compute.manager [None req-7ad16c3c-0f7e-4ccb-8ec7-4eab967472a4 tempest-ServerRescueNegativeTestJSON-568068692 tempest-ServerRescueNegativeTestJSON-568068692-project-member] [instance: e239df07-066e-4dff-8302-9945a610a43a] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1073.043106] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7ad16c3c-0f7e-4ccb-8ec7-4eab967472a4 tempest-ServerRescueNegativeTestJSON-568068692 tempest-ServerRescueNegativeTestJSON-568068692-project-member] Lock "e239df07-066e-4dff-8302-9945a610a43a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 220.597s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1073.051689] env[67964]: DEBUG nova.compute.manager [None req-aa658a66-dcf6-4e5e-ad7d-2bbc7f27d7bf tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] [instance: 809c38e0-bc92-4a77-b307-773b6df211c5] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1073.077974] env[67964]: DEBUG nova.compute.manager [None req-aa658a66-dcf6-4e5e-ad7d-2bbc7f27d7bf tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] [instance: 809c38e0-bc92-4a77-b307-773b6df211c5] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1073.100605] env[67964]: DEBUG oslo_concurrency.lockutils [None req-aa658a66-dcf6-4e5e-ad7d-2bbc7f27d7bf tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] Lock "809c38e0-bc92-4a77-b307-773b6df211c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 218.469s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1073.112151] env[67964]: DEBUG nova.compute.manager [None req-aa658a66-dcf6-4e5e-ad7d-2bbc7f27d7bf tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] [instance: 18c8cc6b-a7aa-43fc-b048-1d788f4c162b] Starting instance...
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1073.136054] env[67964]: DEBUG nova.compute.manager [None req-aa658a66-dcf6-4e5e-ad7d-2bbc7f27d7bf tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] [instance: 18c8cc6b-a7aa-43fc-b048-1d788f4c162b] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1073.156361] env[67964]: DEBUG oslo_concurrency.lockutils [None req-aa658a66-dcf6-4e5e-ad7d-2bbc7f27d7bf tempest-MultipleCreateTestJSON-150964173 tempest-MultipleCreateTestJSON-150964173-project-member] Lock "18c8cc6b-a7aa-43fc-b048-1d788f4c162b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 218.497s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1073.165225] env[67964]: DEBUG nova.compute.manager [None req-96730b7e-0330-46b1-a456-bae29458dc8d tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: d90509f8-1957-4bb3-b4ec-eba8b37705b6] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1073.188944] env[67964]: DEBUG nova.compute.manager [None req-96730b7e-0330-46b1-a456-bae29458dc8d tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: d90509f8-1957-4bb3-b4ec-eba8b37705b6] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1073.208142] env[67964]: DEBUG oslo_concurrency.lockutils [None req-96730b7e-0330-46b1-a456-bae29458dc8d tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "d90509f8-1957-4bb3-b4ec-eba8b37705b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 212.595s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1073.216226] env[67964]: DEBUG nova.compute.manager [None req-0b38ccca-91fa-44c4-80bb-fb028048d55d tempest-ServerActionsV293TestJSON-564338386 tempest-ServerActionsV293TestJSON-564338386-project-member] [instance: dabdde79-50a8-43fd-a998-868aec05d825] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1073.237941] env[67964]: DEBUG nova.compute.manager [None req-0b38ccca-91fa-44c4-80bb-fb028048d55d tempest-ServerActionsV293TestJSON-564338386 tempest-ServerActionsV293TestJSON-564338386-project-member] [instance: dabdde79-50a8-43fd-a998-868aec05d825] Instance disappeared before build.
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1073.257977] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0b38ccca-91fa-44c4-80bb-fb028048d55d tempest-ServerActionsV293TestJSON-564338386 tempest-ServerActionsV293TestJSON-564338386-project-member] Lock "dabdde79-50a8-43fd-a998-868aec05d825" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 205.280s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1073.265855] env[67964]: DEBUG nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1073.335084] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1073.335362] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1073.336816] env[67964]: INFO nova.compute.claims [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1073.630326] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73494fe9-e34e-4b3b-80ba-1156d67e6aec {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.637922] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df4df6e4-6226-41d7-bf90-5d022d26f4cc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.669178] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d43a4a2-a092-4612-bce9-24f3152f0c14 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.675976] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaf08b39-6e3e-48a9-8873-7213267c52e0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.688725] env[67964]: DEBUG nova.compute.provider_tree [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1073.698832] env[67964]: DEBUG nova.scheduler.client.report [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1073.713672] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.378s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1073.714190] env[67964]: DEBUG nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1073.746448] env[67964]: DEBUG nova.compute.utils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1073.747941] env[67964]: DEBUG nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1073.748124] env[67964]: DEBUG nova.network.neutron [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1073.756763] env[67964]: DEBUG nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1073.823834] env[67964]: DEBUG nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1073.827253] env[67964]: DEBUG nova.policy [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dcdece077e1c4d17b88c28e4fe9e43e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '99e685d2f45a4811941a4e103fe03567', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1073.854878] env[67964]: DEBUG nova.virt.hardware [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1073.855196] env[67964]: DEBUG nova.virt.hardware [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1073.855437] env[67964]: DEBUG nova.virt.hardware [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1073.855663] env[67964]: DEBUG nova.virt.hardware [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1073.855828] env[67964]: DEBUG nova.virt.hardware [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1073.856042] env[67964]: DEBUG nova.virt.hardware [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1073.856898] env[67964]: DEBUG 
nova.virt.hardware [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1073.857412] env[67964]: DEBUG nova.virt.hardware [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1073.858009] env[67964]: DEBUG nova.virt.hardware [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1073.858009] env[67964]: DEBUG nova.virt.hardware [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1073.858351] env[67964]: DEBUG nova.virt.hardware [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1073.859009] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97b1172d-8300-414c-9f0c-f2f771505c0f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.867990] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-857bafe2-0ec2-4b92-a6db-e0c145e0b71a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.221582] env[67964]: DEBUG nova.network.neutron [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Successfully created port: f188f36f-a045-4f48-898a-de38b9206c72 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1074.783525] env[67964]: DEBUG nova.compute.manager [req-06dd115f-7946-4894-b5bb-ce11698c3ea0 req-4f9df89c-4070-4928-a916-d973e093a856 service nova] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Received event network-vif-plugged-f188f36f-a045-4f48-898a-de38b9206c72 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1074.783813] env[67964]: DEBUG oslo_concurrency.lockutils [req-06dd115f-7946-4894-b5bb-ce11698c3ea0 req-4f9df89c-4070-4928-a916-d973e093a856 service nova] Acquiring lock "67eb58c3-a895-4427-9197-3b0c731a123a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1074.784112] env[67964]: DEBUG oslo_concurrency.lockutils [req-06dd115f-7946-4894-b5bb-ce11698c3ea0 
req-4f9df89c-4070-4928-a916-d973e093a856 service nova] Lock "67eb58c3-a895-4427-9197-3b0c731a123a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1074.784291] env[67964]: DEBUG oslo_concurrency.lockutils [req-06dd115f-7946-4894-b5bb-ce11698c3ea0 req-4f9df89c-4070-4928-a916-d973e093a856 service nova] Lock "67eb58c3-a895-4427-9197-3b0c731a123a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1074.784470] env[67964]: DEBUG nova.compute.manager [req-06dd115f-7946-4894-b5bb-ce11698c3ea0 req-4f9df89c-4070-4928-a916-d973e093a856 service nova] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] No waiting events found dispatching network-vif-plugged-f188f36f-a045-4f48-898a-de38b9206c72 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1074.784632] env[67964]: WARNING nova.compute.manager [req-06dd115f-7946-4894-b5bb-ce11698c3ea0 req-4f9df89c-4070-4928-a916-d973e093a856 service nova] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Received unexpected event network-vif-plugged-f188f36f-a045-4f48-898a-de38b9206c72 for instance with vm_state building and task_state spawning. [ 1074.877317] env[67964]: DEBUG nova.network.neutron [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Successfully updated port: f188f36f-a045-4f48-898a-de38b9206c72 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1074.891885] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Acquiring lock "refresh_cache-67eb58c3-a895-4427-9197-3b0c731a123a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1074.892059] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Acquired lock "refresh_cache-67eb58c3-a895-4427-9197-3b0c731a123a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1074.892229] env[67964]: DEBUG nova.network.neutron [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1074.946765] env[67964]: DEBUG nova.network.neutron [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Instance cache missing network info. 
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1075.352029] env[67964]: DEBUG nova.network.neutron [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Updating instance_info_cache with network_info: [{"id": "f188f36f-a045-4f48-898a-de38b9206c72", "address": "fa:16:3e:ca:47:0b", "network": {"id": "4b5a38e4-aee5-4c30-9f7d-06f31488ee91", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-2041448799-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "99e685d2f45a4811941a4e103fe03567", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20f072ba-9cfa-4ae8-a56c-d3082cbe6f5e", "external-id": "nsx-vlan-transportzone-594", "segmentation_id": 594, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf188f36f-a0", "ovs_interfaceid": "f188f36f-a045-4f48-898a-de38b9206c72", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1075.364985] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Releasing lock "refresh_cache-67eb58c3-a895-4427-9197-3b0c731a123a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1075.365328] env[67964]: DEBUG nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Instance network_info: |[{"id": "f188f36f-a045-4f48-898a-de38b9206c72", "address": "fa:16:3e:ca:47:0b", "network": {"id": "4b5a38e4-aee5-4c30-9f7d-06f31488ee91", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-2041448799-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "99e685d2f45a4811941a4e103fe03567", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20f072ba-9cfa-4ae8-a56c-d3082cbe6f5e", "external-id": "nsx-vlan-transportzone-594", "segmentation_id": 594, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf188f36f-a0", "ovs_interfaceid": "f188f36f-a045-4f48-898a-de38b9206c72", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 1075.365762] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ca:47:0b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '20f072ba-9cfa-4ae8-a56c-d3082cbe6f5e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f188f36f-a045-4f48-898a-de38b9206c72', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1075.373664] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Creating folder: Project (99e685d2f45a4811941a4e103fe03567). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1075.373771] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-817843ab-9ea6-4345-bcc9-1c25fb199d23 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1075.385210] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Created folder: Project (99e685d2f45a4811941a4e103fe03567) in parent group-v690366. [ 1075.385394] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Creating folder: Instances. Parent ref: group-v690430. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1075.385614] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9c8f709f-615a-46df-924a-78e98e7417f0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1075.394411] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Created folder: Instances in parent group-v690430. [ 1075.394635] env[67964]: DEBUG oslo.service.loopingcall [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1075.394810] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1075.394999] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-549495ab-1ea7-45bc-9e12-9997c782f1b8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1075.414136] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1075.414136] env[67964]: value = "task-3456783" [ 1075.414136] env[67964]: _type = "Task" [ 1075.414136] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1075.424100] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456783, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1075.924669] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456783, 'name': CreateVM_Task, 'duration_secs': 0.268824} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1075.924995] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1075.925528] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1075.925694] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1075.926028] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1075.926280] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-58a49deb-1a29-46c5-b3ba-3f178cb483e0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1075.930458] env[67964]: DEBUG oslo_vmware.api [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Waiting for the task: (returnval){ [ 1075.930458] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]526cd38b-50a6-edf6-24d4-fcf890435d00" [ 1075.930458] env[67964]: _type = "Task" [ 1075.930458] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1075.937828] env[67964]: DEBUG oslo_vmware.api [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]526cd38b-50a6-edf6-24d4-fcf890435d00, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1076.440714] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1076.440976] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1076.441252] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1076.811759] env[67964]: DEBUG nova.compute.manager [req-569c9535-b1e1-4716-ba59-3762c8538096 req-eb083a0e-2e34-4d44-90ba-24fbab738c64 service nova] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Received event network-changed-f188f36f-a045-4f48-898a-de38b9206c72 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1076.811927] env[67964]: DEBUG nova.compute.manager [req-569c9535-b1e1-4716-ba59-3762c8538096 req-eb083a0e-2e34-4d44-90ba-24fbab738c64 service nova] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Refreshing instance network info cache due to event network-changed-f188f36f-a045-4f48-898a-de38b9206c72. 
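
The Acquiring/Acquired/Releasing lines around the image cache path above come from oslo.concurrency's named locks: every request that needs image b261268a-9800-40a9-afde-85d61f8eed6a on datastore1 serializes on one lock so the image is fetched into the cache only once. A minimal, runnable sketch of the same pattern, with the lock name copied from the log:

    from oslo_concurrency import lockutils

    image_id = 'b261268a-9800-40a9-afde-85d61f8eed6a'
    lock_name = '[datastore1] devstack-image-cache_base/%s' % image_id

    with lockutils.lock(lock_name):
        # Fetch-or-reuse the cached VMDK here; the DEBUG lock lines in
        # the log bracket exactly this kind of critical section.
        pass
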
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1076.812153] env[67964]: DEBUG oslo_concurrency.lockutils [req-569c9535-b1e1-4716-ba59-3762c8538096 req-eb083a0e-2e34-4d44-90ba-24fbab738c64 service nova] Acquiring lock "refresh_cache-67eb58c3-a895-4427-9197-3b0c731a123a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1076.812295] env[67964]: DEBUG oslo_concurrency.lockutils [req-569c9535-b1e1-4716-ba59-3762c8538096 req-eb083a0e-2e34-4d44-90ba-24fbab738c64 service nova] Acquired lock "refresh_cache-67eb58c3-a895-4427-9197-3b0c731a123a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1076.812450] env[67964]: DEBUG nova.network.neutron [req-569c9535-b1e1-4716-ba59-3762c8538096 req-eb083a0e-2e34-4d44-90ba-24fbab738c64 service nova] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Refreshing network info cache for port f188f36f-a045-4f48-898a-de38b9206c72 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1077.060115] env[67964]: DEBUG nova.network.neutron [req-569c9535-b1e1-4716-ba59-3762c8538096 req-eb083a0e-2e34-4d44-90ba-24fbab738c64 service nova] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Updated VIF entry in instance network info cache for port f188f36f-a045-4f48-898a-de38b9206c72. {{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1077.060501] env[67964]: DEBUG nova.network.neutron [req-569c9535-b1e1-4716-ba59-3762c8538096 req-eb083a0e-2e34-4d44-90ba-24fbab738c64 service nova] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Updating instance_info_cache with network_info: [{"id": "f188f36f-a045-4f48-898a-de38b9206c72", "address": "fa:16:3e:ca:47:0b", "network": {"id": "4b5a38e4-aee5-4c30-9f7d-06f31488ee91", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-2041448799-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "99e685d2f45a4811941a4e103fe03567", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20f072ba-9cfa-4ae8-a56c-d3082cbe6f5e", "external-id": "nsx-vlan-transportzone-594", "segmentation_id": 594, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf188f36f-a0", "ovs_interfaceid": "f188f36f-a045-4f48-898a-de38b9206c72", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1077.071022] env[67964]: DEBUG oslo_concurrency.lockutils [req-569c9535-b1e1-4716-ba59-3762c8538096 req-eb083a0e-2e34-4d44-90ba-24fbab738c64 service nova] Releasing lock "refresh_cache-67eb58c3-a895-4427-9197-3b0c731a123a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1078.800655] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1078.800980] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1078.812370] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1078.812592] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1078.812742] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1078.812899] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1078.814047] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53434cf3-e638-4b7d-be0e-95e2fffb746a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.822971] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d85e6c9a-359b-4a15-be9c-934022f90478 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.836753] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-184909f2-36ac-4db8-aae7-ac0966e7f526 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.842674] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75fc75b7-48fe-4394-9264-8dabe013ecfd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.870869] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180910MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1078.871057] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1078.871262] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1078.943394] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 6580c348-f5a4-4f20-a6fb-8942202a526e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.943556] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fed6991c-9b59-43bb-8cda-96053adb798b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.943682] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 707828f6-0267-42ff-95e5-6b328382b017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.943807] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0768fe80-7dd3-42ec-8e22-42a6aece5bef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.943925] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9e47d3ce-3897-458b-ac85-d98745e9aeb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.944057] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea492fb8-2352-436c-a7d5-f20423f4d353 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.944186] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c648c89a-ca70-4a15-9083-0cbe9e5bee23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.944314] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9793d383-9033-4f86-b7bb-6b2e43347cd6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.944507] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 5fbee4c3-bc7c-4582-b976-b0d619a69cdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.944642] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 67eb58c3-a895-4427-9197-3b0c731a123a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.957999] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.968991] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 4877dc66-8ac8-4f7e-9a49-97a7adb95e72 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.978237] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d9dcb5d4-e8a3-4d4d-af94-1bde87121c08 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.987141] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 57445f5b-8a3a-4d55-b926-ee2d3e24b6ce has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.996222] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 53718899-b65f-4e3b-a8d6-7277e946ab43 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1079.005418] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 236faf76-d72e-4c2b-9b44-9d1866491310 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1079.014219] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f36ba9db-c547-4d77-9e49-24bfcc995e89 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1079.023162] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9cd7ef82-147a-4303-a773-32b161f819ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1079.032120] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fbf2ae36-60a6-48e2-b115-22b13b5c4cc2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1079.040693] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 02b1d6da-0aa2-4199-a86a-fa5b197b2813 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1079.049283] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance b2cc5ba7-c5d1-4ecf-ba3a-fee3facbd159 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
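
The audit above walks this compute node's placement allocations and sorts them into buckets: instances the host actively manages (allocations kept), instances scheduled here but not yet started (heal skipped), and anything else (cleanup candidates). Illustrative-only logic, not Nova's exact code:

    def audit_allocations(allocations, tracked, scheduled):
        """allocations: {instance_uuid: resources}, where resources is a
        dict like {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}."""
        for uuid, resources in allocations.items():
            if uuid in tracked:
                print('%s actively managed, keeping %s' % (uuid, resources))
            elif uuid in scheduled:
                print('%s scheduled but not started, skipping heal' % uuid)
            else:
                print('%s unknown here, candidate for removal' % uuid)
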
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1079.057782] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 4cdc869e-2b97-4107-ae4d-49f99131048a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1079.067287] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7fe6f046-65c9-4464-931c-07e781c497aa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1079.067287] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1079.067287] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1079.318328] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b523a13-a020-47ce-887a-35548018656d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.325951] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48afdff2-704c-4b6c-982c-782304814be0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.355210] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-627ef7c3-f59e-4afb-9ccf-b56db7383277 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.362545] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d47bc835-6370-4bb2-ba95-5775f025fd36 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.375452] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1079.383865] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1079.396728] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1079.396929] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.526s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1080.396482] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1080.800081] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1080.800338] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1080.800483] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... 
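
The "Running periodic task" lines are driven by oslo.service's periodic_task decorator: each decorated method runs on its own spacing, and _reclaim_queued_deletes short-circuits when reclaim_instance_interval <= 0, exactly as logged. A self-contained sketch (spacing values illustrative):

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF
    CONF.register_opts([cfg.IntOpt('reclaim_instance_interval', default=0)])

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super(Manager, self).__init__(CONF)

        @periodic_task.periodic_task(spacing=60)
        def _poll_unconfirmed_resizes(self, context):
            pass

        @periodic_task.periodic_task
        def _reclaim_queued_deletes(self, context):
            if CONF.reclaim_instance_interval <= 0:
                print('CONF.reclaim_instance_interval <= 0, skipping...')
                return

    # A service loop would invoke this on a timer:
    Manager().run_periodic_tasks(context=None)
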
{{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1082.801098] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1083.795761] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1083.800378] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1084.800118] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1084.800387] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1084.800451] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1084.820765] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1084.820765] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1084.821039] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1084.822025] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1084.822025] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1084.822025] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1084.822025] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1084.822025] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1084.822355] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1084.822355] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1084.822355] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
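
_heal_instance_info_cache rebuilds its candidate list each pass, skips instances still in the Building state, and refreshes the network info cache for the first eligible instance; here every candidate was Building, hence the closing "Didn't find any instances" line. A sketch of that selection loop (illustrative, not Nova's exact code):

    def heal_info_cache(instances, refresh_cache):
        for inst in instances:
            if inst.get('vm_state') == 'building':
                print('Skipping network cache update for %(uuid)s: '
                      'it is Building.' % inst)
                continue
            refresh_cache(inst)  # one instance healed per periodic pass
            return inst
        print("Didn't find any instances for network info cache update.")
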
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1090.823663] env[67964]: DEBUG oslo_concurrency.lockutils [None req-5d58ebb5-b5e4-4820-ab64-d45d21805e6c tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Acquiring lock "67eb58c3-a895-4427-9197-3b0c731a123a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1117.516452] env[67964]: WARNING oslo_vmware.rw_handles [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1117.516452] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1117.516452] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1117.516452] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1117.516452] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1117.516452] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1117.516452] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1117.516452] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1117.516452] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1117.516452] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1117.516452] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1117.516452] env[67964]: ERROR oslo_vmware.rw_handles [ 1117.517364] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/736fe7f2-067b-4b51-bf27-3ef43156c882/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1117.519111] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1117.519423] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Copying Virtual Disk [datastore1] vmware_temp/736fe7f2-067b-4b51-bf27-3ef43156c882/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/736fe7f2-067b-4b51-bf27-3ef43156c882/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1117.519779]
env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f989518c-c9fd-4b17-b547-d8cd4a6be203 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1117.527285] env[67964]: DEBUG oslo_vmware.api [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Waiting for the task: (returnval){ [ 1117.527285] env[67964]: value = "task-3456784" [ 1117.527285] env[67964]: _type = "Task" [ 1117.527285] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1117.535412] env[67964]: DEBUG oslo_vmware.api [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Task: {'id': task-3456784, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1118.038034] env[67964]: DEBUG oslo_vmware.exceptions [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1118.038198] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1118.039029] env[67964]: ERROR nova.compute.manager [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1118.039029] env[67964]: Faults: ['InvalidArgument'] [ 1118.039029] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Traceback (most recent call last): [ 1118.039029] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1118.039029] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] yield resources [ 1118.039029] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1118.039029] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] self.driver.spawn(context, instance, image_meta, [ 1118.039029] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1118.039029] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1118.039029] env[67964]: ERROR nova.compute.manager [instance: 
6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1118.039029] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] self._fetch_image_if_missing(context, vi) [ 1118.039029] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1118.039503] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] image_cache(vi, tmp_image_ds_loc) [ 1118.039503] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1118.039503] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] vm_util.copy_virtual_disk( [ 1118.039503] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1118.039503] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] session._wait_for_task(vmdk_copy_task) [ 1118.039503] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1118.039503] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] return self.wait_for_task(task_ref) [ 1118.039503] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1118.039503] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] return evt.wait() [ 1118.039503] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1118.039503] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] result = hub.switch() [ 1118.039503] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1118.039503] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] return self.greenlet.switch() [ 1118.039852] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1118.039852] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] self.f(*self.args, **self.kw) [ 1118.039852] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1118.039852] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] raise exceptions.translate_fault(task_info.error) [ 1118.039852] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1118.039852] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Faults: 
['InvalidArgument'] [ 1118.039852] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] [ 1118.039852] env[67964]: INFO nova.compute.manager [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Terminating instance [ 1118.040509] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1118.040723] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1118.040969] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a3669ff2-709a-4482-ac96-9f169dfbfb93 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1118.042964] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquiring lock "refresh_cache-6580c348-f5a4-4f20-a6fb-8942202a526e" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1118.043133] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquired lock "refresh_cache-6580c348-f5a4-4f20-a6fb-8942202a526e" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1118.043296] env[67964]: DEBUG nova.network.neutron [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1118.049949] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1118.050163] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Folder [datastore1] devstack-image-cache_base created. 
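
The WARNING traceback above fires when oslo.vmware closes its upload handle and the ESX host has already dropped the connection: close() calls getresponse(), which raises http.client.RemoteDisconnected. A hedged sketch of tolerating that on close (the upload itself may still have completed):

    import http.client

    def close_upload(conn):
        """conn: an http.client.HTTPSConnection used for a file upload."""
        try:
            conn.getresponse()  # drain the server's reply, if any
        except http.client.RemoteDisconnected:
            # Remote end closed without a response, as in the log; treat
            # it as a warning rather than a hard failure.
            pass
        finally:
            conn.close()
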
{{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1118.051308] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-76147c6c-f6f8-4860-ba4c-87e8e6085f84 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1118.058819] env[67964]: DEBUG oslo_vmware.api [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Waiting for the task: (returnval){ [ 1118.058819] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52d3f293-7a0c-d4ab-ce5c-3237bbbfda86" [ 1118.058819] env[67964]: _type = "Task" [ 1118.058819] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1118.066264] env[67964]: DEBUG oslo_vmware.api [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52d3f293-7a0c-d4ab-ce5c-3237bbbfda86, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1118.090675] env[67964]: DEBUG nova.network.neutron [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1118.192060] env[67964]: DEBUG nova.network.neutron [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1118.202302] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Releasing lock "refresh_cache-6580c348-f5a4-4f20-a6fb-8942202a526e" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1118.202717] env[67964]: DEBUG nova.compute.manager [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Start destroying the instance on the hypervisor. 
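
Destroying the instance proceeds in the order the log shows next: unregister the VM from vCenter inventory, then delete its datastore directory through the FileManager and wait for the delete task. Sketched with hypothetical vm_ref and datacenter_ref references and a session as in the earlier sketches:

    # UnregisterVM removes the VM from inventory without touching disk.
    session.invoke_api(session.vim, 'UnregisterVM', vm_ref)

    # Then delete the VM's files; DeleteDatastoreFile_Task is a method
    # of the FileManager managed object.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore1] 6580c348-f5a4-4f20-a6fb-8942202a526e',
        datacenter=datacenter_ref)
    session.wait_for_task(task)
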
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1118.202905] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1118.204038] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0e6f3b1-9046-4bc6-b086-9402e448d7fd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1118.211825] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1118.212084] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d9080ee0-0792-4957-b17c-cbe5c9cde048 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1118.248473] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1118.248702] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1118.248877] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Deleting the datastore file [datastore1] 6580c348-f5a4-4f20-a6fb-8942202a526e {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1118.249141] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c335afb2-28e7-4b25-ba27-cd7ffed8ea34 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1118.254723] env[67964]: DEBUG oslo_vmware.api [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Waiting for the task: (returnval){ [ 1118.254723] env[67964]: value = "task-3456786" [ 1118.254723] env[67964]: _type = "Task" [ 1118.254723] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1118.262422] env[67964]: DEBUG oslo_vmware.api [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Task: {'id': task-3456786, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1118.570431] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1118.570746] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Creating directory with path [datastore1] vmware_temp/d7f16a9a-8e1f-493f-9033-d5e8bc0c4091/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1118.570952] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0e58b108-9a3a-42f0-b3a6-6ffea9352ac9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1118.581695] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Created directory with path [datastore1] vmware_temp/d7f16a9a-8e1f-493f-9033-d5e8bc0c4091/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1118.581879] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Fetch image to [datastore1] vmware_temp/d7f16a9a-8e1f-493f-9033-d5e8bc0c4091/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1118.582095] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/d7f16a9a-8e1f-493f-9033-d5e8bc0c4091/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1118.582819] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ca89cb1-9675-4f4a-a771-9ca7eeaabfd7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1118.589536] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fe26420-81ba-4d35-afd1-6296491029ca {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1118.598266] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d76e603-46f0-44e3-b310-1ff72804623a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1118.627987] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-affc08b2-94e7-4acc-8cdb-6b418e3e0e86 
{{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1118.633544] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4f099ef2-e9f5-4807-959c-9b406395e5ea {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1118.653235] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1118.701874] env[67964]: DEBUG oslo_vmware.rw_handles [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d7f16a9a-8e1f-493f-9033-d5e8bc0c4091/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1118.761614] env[67964]: DEBUG oslo_vmware.rw_handles [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1118.761795] env[67964]: DEBUG oslo_vmware.rw_handles [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d7f16a9a-8e1f-493f-9033-d5e8bc0c4091/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1118.765740] env[67964]: DEBUG oslo_vmware.api [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Task: {'id': task-3456786, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.03668} completed successfully. 
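
The "Creating HTTP connection to write to file" line is oslo.vmware's FileWriteHandle posting the image straight into the datastore (hence the dcPath/dsName query arguments in the URL). A sketch of that upload; the argument order follows my reading of oslo.vmware and should be treated as approximate, while the host, cookies, path, size, and iterator are all hypothetical:

    from oslo_vmware import rw_handles

    # cookies: the vCenter session cookies; image_size: byte length of
    # the image data; image_iter: chunks from the image service.
    handle = rw_handles.FileWriteHandle(
        'esx7c2n1.example.com', 443, 'ha-datacenter', 'datastore1',
        cookies, 'vmware_temp/.../tmp-sparse.vmdk', image_size)
    for chunk in image_iter:
        handle.write(chunk)
    handle.close()  # the "Closing write handle" line above
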
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1118.765979] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1118.766218] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1118.766343] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1118.766513] env[67964]: INFO nova.compute.manager [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Took 0.56 seconds to destroy the instance on the hypervisor. [ 1118.766735] env[67964]: DEBUG oslo.service.loopingcall [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1118.766939] env[67964]: DEBUG nova.compute.manager [-] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Skipping network deallocation for instance since networking was not requested.
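
When the CopyVirtualDisk task fails, oslo.vmware's task poller translates the task's error into a VimFaultException whose fault_list names the vSphere fault ('InvalidArgument' here), which is what the ERROR traceback below re-raises. A sketch of catching and inspecting it; copy_task is a hypothetical task reference and session is as in the earlier sketches:

    from oslo_vmware import exceptions as vexc

    try:
        session.wait_for_task(copy_task)
    except vexc.VimFaultException as exc:
        if 'InvalidArgument' in exc.fault_list:
            # e.g. "A specified parameter was not correct: fileType"
            print('vCenter rejected a parameter: %s' % exc)
        raise
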
{{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 1118.769136] env[67964]: DEBUG nova.compute.claims [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1118.769275] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1118.769494] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1119.133416] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de23f99c-912a-4c9f-8e45-76454d821c02 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1119.141272] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ce923ce-a56e-48d0-923b-559b48a1d531 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1119.171325] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9c70e9b-f91a-48e4-92b4-8ac29b2cf94d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1119.178797] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e578cb8e-12f9-444c-aa0f-3fbeeb7d2716 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1119.191734] env[67964]: DEBUG nova.compute.provider_tree [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1119.201599] env[67964]: DEBUG nova.scheduler.client.report [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1119.217552] env[67964]: DEBUG 
oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.448s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1119.218222] env[67964]: ERROR nova.compute.manager [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1119.218222] env[67964]: Faults: ['InvalidArgument'] [ 1119.218222] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Traceback (most recent call last): [ 1119.218222] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1119.218222] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] self.driver.spawn(context, instance, image_meta, [ 1119.218222] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1119.218222] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1119.218222] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1119.218222] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] self._fetch_image_if_missing(context, vi) [ 1119.218222] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1119.218222] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] image_cache(vi, tmp_image_ds_loc) [ 1119.218222] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1119.218545] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] vm_util.copy_virtual_disk( [ 1119.218545] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1119.218545] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] session._wait_for_task(vmdk_copy_task) [ 1119.218545] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1119.218545] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] return self.wait_for_task(task_ref) [ 1119.218545] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1119.218545] env[67964]: ERROR nova.compute.manager 
[instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] return evt.wait() [ 1119.218545] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1119.218545] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] result = hub.switch() [ 1119.218545] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1119.218545] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] return self.greenlet.switch() [ 1119.218545] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1119.218545] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] self.f(*self.args, **self.kw) [ 1119.218867] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1119.218867] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] raise exceptions.translate_fault(task_info.error) [ 1119.218867] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1119.218867] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Faults: ['InvalidArgument'] [ 1119.218867] env[67964]: ERROR nova.compute.manager [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] [ 1119.219043] env[67964]: DEBUG nova.compute.utils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1119.220942] env[67964]: DEBUG nova.compute.manager [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Build of instance 6580c348-f5a4-4f20-a6fb-8942202a526e was re-scheduled: A specified parameter was not correct: fileType [ 1119.220942] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1119.221129] env[67964]: DEBUG nova.compute.manager [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1119.221378] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquiring lock "refresh_cache-6580c348-f5a4-4f20-a6fb-8942202a526e" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1119.221526] env[67964]: DEBUG oslo_concurrency.lockutils [None 
req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquired lock "refresh_cache-6580c348-f5a4-4f20-a6fb-8942202a526e" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1119.221690] env[67964]: DEBUG nova.network.neutron [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1119.255430] env[67964]: DEBUG nova.network.neutron [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1119.339369] env[67964]: DEBUG nova.network.neutron [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1119.349823] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Releasing lock "refresh_cache-6580c348-f5a4-4f20-a6fb-8942202a526e" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1119.350076] env[67964]: DEBUG nova.compute.manager [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1119.350264] env[67964]: DEBUG nova.compute.manager [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Skipping network deallocation for instance since networking was not requested. 
{{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 1119.435860] env[67964]: INFO nova.scheduler.client.report [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Deleted allocations for instance 6580c348-f5a4-4f20-a6fb-8942202a526e [ 1119.452236] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ce03eff0-2615-4e22-adc8-21a678021c46 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Lock "6580c348-f5a4-4f20-a6fb-8942202a526e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 520.840s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1119.453277] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Lock "6580c348-f5a4-4f20-a6fb-8942202a526e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 322.767s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1119.453494] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquiring lock "6580c348-f5a4-4f20-a6fb-8942202a526e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1119.453691] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Lock "6580c348-f5a4-4f20-a6fb-8942202a526e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1119.453853] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Lock "6580c348-f5a4-4f20-a6fb-8942202a526e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1119.455737] env[67964]: INFO nova.compute.manager [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Terminating instance [ 1119.457436] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquiring lock "refresh_cache-6580c348-f5a4-4f20-a6fb-8942202a526e" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1119.457591] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 
tempest-ServerDiagnosticsV248Test-274850163-project-member] Acquired lock "refresh_cache-6580c348-f5a4-4f20-a6fb-8942202a526e" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1119.457753] env[67964]: DEBUG nova.network.neutron [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1119.466079] env[67964]: DEBUG nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1119.483383] env[67964]: DEBUG nova.network.neutron [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1119.516041] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1119.516323] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1119.517729] env[67964]: INFO nova.compute.claims [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1119.542319] env[67964]: DEBUG nova.network.neutron [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1119.550416] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Releasing lock "refresh_cache-6580c348-f5a4-4f20-a6fb-8942202a526e" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1119.550764] env[67964]: DEBUG nova.compute.manager [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Start destroying the 
instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1119.550953] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1119.551670] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6857d39f-ee98-47d4-b617-cfd998d1b98a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1119.562526] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8389fda-5840-4115-8c41-d5058df2a007 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1119.594774] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6580c348-f5a4-4f20-a6fb-8942202a526e could not be found. [ 1119.595110] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1119.595210] env[67964]: INFO nova.compute.manager [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1119.595396] env[67964]: DEBUG oslo.service.loopingcall [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1119.597831] env[67964]: DEBUG nova.compute.manager [-] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1119.597942] env[67964]: DEBUG nova.network.neutron [-] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1119.626335] env[67964]: DEBUG nova.network.neutron [-] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Instance cache missing network info. 
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1119.635360] env[67964]: DEBUG nova.network.neutron [-] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1119.644728] env[67964]: INFO nova.compute.manager [-] [instance: 6580c348-f5a4-4f20-a6fb-8942202a526e] Took 0.05 seconds to deallocate network for instance. [ 1119.738655] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e3fdab3d-49f6-4347-8631-75bf74868758 tempest-ServerDiagnosticsV248Test-274850163 tempest-ServerDiagnosticsV248Test-274850163-project-member] Lock "6580c348-f5a4-4f20-a6fb-8942202a526e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.285s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1119.838613] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d62cad2-ba2d-4f82-aef3-b2b1e775c007 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1119.846551] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf2fc227-a81d-4c6f-a9e8-dff2b1e5d8c4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1119.876141] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bdd2de9-1e17-4817-82b8-dc062de5f34e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1119.883262] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17f54611-bacb-426e-89d6-9715488a3850 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1119.896610] env[67964]: DEBUG nova.compute.provider_tree [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1119.905847] env[67964]: DEBUG nova.scheduler.client.report [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1119.919690] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.403s {{(pid=67964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1119.920267] env[67964]: DEBUG nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1119.961811] env[67964]: DEBUG nova.compute.utils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1119.963271] env[67964]: DEBUG nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1119.963522] env[67964]: DEBUG nova.network.neutron [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1119.974344] env[67964]: DEBUG nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1120.024030] env[67964]: DEBUG nova.policy [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4e6ba44caba1469ebfe63dc193fdd8bb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02a615a0d5824671beedaa49cd2ebf3e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1120.036341] env[67964]: DEBUG nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1120.061075] env[67964]: DEBUG nova.virt.hardware [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=<?>,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T12:20:06Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1120.061075] env[67964]: DEBUG nova.virt.hardware [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1120.061253] env[67964]: DEBUG nova.virt.hardware [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1120.061376] env[67964]: DEBUG nova.virt.hardware [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1120.061750] env[67964]: DEBUG nova.virt.hardware [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1120.061750] env[67964]: DEBUG nova.virt.hardware [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1120.061894] env[67964]: DEBUG nova.virt.hardware [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1120.062062] env[67964]: DEBUG nova.virt.hardware [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1120.062223] env[67964]: DEBUG nova.virt.hardware [None 
req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1120.062390] env[67964]: DEBUG nova.virt.hardware [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1120.062541] env[67964]: DEBUG nova.virt.hardware [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1120.063438] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e83ecb54-4e29-48d3-9155-a61f2c249846 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1120.071862] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42044abb-e2d6-433f-9f55-dd49fc1cfb7d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1120.619894] env[67964]: DEBUG nova.network.neutron [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Successfully created port: caa3714b-33d7-4bac-91b1-48aefd91be03 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1121.194135] env[67964]: DEBUG nova.compute.manager [req-222509f8-9c69-4fb3-8254-b9fde4c2678e req-6435e9ff-5163-4ffa-9dd8-380317bcf42e service nova] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Received event network-vif-plugged-caa3714b-33d7-4bac-91b1-48aefd91be03 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1121.194379] env[67964]: DEBUG oslo_concurrency.lockutils [req-222509f8-9c69-4fb3-8254-b9fde4c2678e req-6435e9ff-5163-4ffa-9dd8-380317bcf42e service nova] Acquiring lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1121.194589] env[67964]: DEBUG oslo_concurrency.lockutils [req-222509f8-9c69-4fb3-8254-b9fde4c2678e req-6435e9ff-5163-4ffa-9dd8-380317bcf42e service nova] Lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1121.194753] env[67964]: DEBUG oslo_concurrency.lockutils [req-222509f8-9c69-4fb3-8254-b9fde4c2678e req-6435e9ff-5163-4ffa-9dd8-380317bcf42e service nova] Lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1121.194917] env[67964]: DEBUG nova.compute.manager 
[req-222509f8-9c69-4fb3-8254-b9fde4c2678e req-6435e9ff-5163-4ffa-9dd8-380317bcf42e service nova] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] No waiting events found dispatching network-vif-plugged-caa3714b-33d7-4bac-91b1-48aefd91be03 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1121.195284] env[67964]: WARNING nova.compute.manager [req-222509f8-9c69-4fb3-8254-b9fde4c2678e req-6435e9ff-5163-4ffa-9dd8-380317bcf42e service nova] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Received unexpected event network-vif-plugged-caa3714b-33d7-4bac-91b1-48aefd91be03 for instance with vm_state building and task_state spawning. [ 1121.270915] env[67964]: DEBUG nova.network.neutron [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Successfully updated port: caa3714b-33d7-4bac-91b1-48aefd91be03 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1121.287922] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Acquiring lock "refresh_cache-09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1121.287922] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Acquired lock "refresh_cache-09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1121.287922] env[67964]: DEBUG nova.network.neutron [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1121.324868] env[67964]: DEBUG nova.network.neutron [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Instance cache missing network info. 
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1121.527770] env[67964]: DEBUG nova.network.neutron [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Updating instance_info_cache with network_info: [{"id": "caa3714b-33d7-4bac-91b1-48aefd91be03", "address": "fa:16:3e:68:d7:83", "network": {"id": "96d00d7a-a497-44d4-bb6b-0bdc6264389f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-604765958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "02a615a0d5824671beedaa49cd2ebf3e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fb143ef7-8271-4a8a-a4aa-8eba9a89f6a1", "external-id": "nsx-vlan-transportzone-504", "segmentation_id": 504, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcaa3714b-33", "ovs_interfaceid": "caa3714b-33d7-4bac-91b1-48aefd91be03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1121.546277] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Releasing lock "refresh_cache-09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1121.546646] env[67964]: DEBUG nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Instance network_info: |[{"id": "caa3714b-33d7-4bac-91b1-48aefd91be03", "address": "fa:16:3e:68:d7:83", "network": {"id": "96d00d7a-a497-44d4-bb6b-0bdc6264389f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-604765958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "02a615a0d5824671beedaa49cd2ebf3e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fb143ef7-8271-4a8a-a4aa-8eba9a89f6a1", "external-id": "nsx-vlan-transportzone-504", "segmentation_id": 504, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcaa3714b-33", "ovs_interfaceid": "caa3714b-33d7-4bac-91b1-48aefd91be03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 1121.547569] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:68:d7:83', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fb143ef7-8271-4a8a-a4aa-8eba9a89f6a1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'caa3714b-33d7-4bac-91b1-48aefd91be03', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1121.554751] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Creating folder: Project (02a615a0d5824671beedaa49cd2ebf3e). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1121.555375] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-aeb19d5d-6135-423e-9aaf-e7f155dd38d9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.565875] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Created folder: Project (02a615a0d5824671beedaa49cd2ebf3e) in parent group-v690366. [ 1121.566071] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Creating folder: Instances. Parent ref: group-v690433. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1121.566304] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-003730c0-74ad-4c08-a1ec-beeaa047fc47 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.575116] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Created folder: Instances in parent group-v690433. [ 1121.575313] env[67964]: DEBUG oslo.service.loopingcall [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1121.575496] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1121.575687] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cb63edc7-eac3-4126-aea2-e605e7305d91 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.593790] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1121.593790] env[67964]: value = "task-3456789" [ 1121.593790] env[67964]: _type = "Task" [ 1121.593790] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1121.601097] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456789, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1122.103886] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456789, 'name': CreateVM_Task, 'duration_secs': 0.306206} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1122.104295] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1122.104914] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1122.105215] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1122.105546] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1122.105797] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-20bd4332-096f-4de9-b99f-d5b3b6e38b45 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1122.110408] env[67964]: DEBUG oslo_vmware.api [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Waiting for the task: (returnval){ [ 1122.110408] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52815667-d431-beea-bfb6-d6a3f5bc2567" [ 1122.110408] env[67964]: _type = "Task" [ 1122.110408] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1122.117757] env[67964]: DEBUG oslo_vmware.api [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52815667-d431-beea-bfb6-d6a3f5bc2567, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1122.622312] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1122.622562] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1122.622911] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1123.223197] env[67964]: DEBUG nova.compute.manager [req-b57ad004-d126-49a2-be47-e1dd3aa2375d req-1c85b9bd-27d8-4ea3-88df-faf0d7e43cdc service nova] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Received event network-changed-caa3714b-33d7-4bac-91b1-48aefd91be03 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1123.223451] env[67964]: DEBUG nova.compute.manager [req-b57ad004-d126-49a2-be47-e1dd3aa2375d req-1c85b9bd-27d8-4ea3-88df-faf0d7e43cdc service nova] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Refreshing instance network info cache due to event network-changed-caa3714b-33d7-4bac-91b1-48aefd91be03. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1123.223627] env[67964]: DEBUG oslo_concurrency.lockutils [req-b57ad004-d126-49a2-be47-e1dd3aa2375d req-1c85b9bd-27d8-4ea3-88df-faf0d7e43cdc service nova] Acquiring lock "refresh_cache-09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1123.223746] env[67964]: DEBUG oslo_concurrency.lockutils [req-b57ad004-d126-49a2-be47-e1dd3aa2375d req-1c85b9bd-27d8-4ea3-88df-faf0d7e43cdc service nova] Acquired lock "refresh_cache-09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1123.223886] env[67964]: DEBUG nova.network.neutron [req-b57ad004-d126-49a2-be47-e1dd3aa2375d req-1c85b9bd-27d8-4ea3-88df-faf0d7e43cdc service nova] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Refreshing network info cache for port caa3714b-33d7-4bac-91b1-48aefd91be03 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1123.481612] env[67964]: DEBUG nova.network.neutron [req-b57ad004-d126-49a2-be47-e1dd3aa2375d req-1c85b9bd-27d8-4ea3-88df-faf0d7e43cdc service nova] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Updated VIF entry in instance network info cache for port caa3714b-33d7-4bac-91b1-48aefd91be03. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1123.481995] env[67964]: DEBUG nova.network.neutron [req-b57ad004-d126-49a2-be47-e1dd3aa2375d req-1c85b9bd-27d8-4ea3-88df-faf0d7e43cdc service nova] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Updating instance_info_cache with network_info: [{"id": "caa3714b-33d7-4bac-91b1-48aefd91be03", "address": "fa:16:3e:68:d7:83", "network": {"id": "96d00d7a-a497-44d4-bb6b-0bdc6264389f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-604765958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "02a615a0d5824671beedaa49cd2ebf3e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fb143ef7-8271-4a8a-a4aa-8eba9a89f6a1", "external-id": "nsx-vlan-transportzone-504", "segmentation_id": 504, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcaa3714b-33", "ovs_interfaceid": "caa3714b-33d7-4bac-91b1-48aefd91be03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1123.491477] env[67964]: DEBUG oslo_concurrency.lockutils [req-b57ad004-d126-49a2-be47-e1dd3aa2375d req-1c85b9bd-27d8-4ea3-88df-faf0d7e43cdc service nova] Releasing lock "refresh_cache-09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1134.080247] env[67964]: DEBUG oslo_concurrency.lockutils [None req-cefeac35-8a93-44cd-ba0d-78c151cc4af1 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Acquiring lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1134.800627] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1134.800811] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Cleaning up deleted instances {{(pid=67964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11199}} [ 1134.809679] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] There are 0 instances to clean {{(pid=67964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11208}} [ 1137.800967] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1139.134211] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 
tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Acquiring lock "18c148fb-1cd4-4537-9b77-089e9b272f83" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.134528] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Lock "18c148fb-1cd4-4537-9b77-089e9b272f83" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1139.810428] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1139.822419] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.822669] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1139.822837] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1139.822993] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1139.824154] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c380afc-6062-4cc7-9ad7-cb3f514d5dc3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.835806] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bbcb633-a3b8-4c7f-a1a8-3ad7566a01b6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.852217] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e43c0992-e475-4328-8311-7c8d5bf9a343 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.858903] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec0511c5-ad93-4fb3-8211-0f94f4b833c8 {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.890818] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180916MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1139.890991] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.891209] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1139.965231] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fed6991c-9b59-43bb-8cda-96053adb798b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1139.965396] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 707828f6-0267-42ff-95e5-6b328382b017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1139.965550] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0768fe80-7dd3-42ec-8e22-42a6aece5bef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1139.965679] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9e47d3ce-3897-458b-ac85-d98745e9aeb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1139.965806] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea492fb8-2352-436c-a7d5-f20423f4d353 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1139.965907] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c648c89a-ca70-4a15-9083-0cbe9e5bee23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1139.966040] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9793d383-9033-4f86-b7bb-6b2e43347cd6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1139.966152] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 5fbee4c3-bc7c-4582-b976-b0d619a69cdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1139.966262] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 67eb58c3-a895-4427-9197-3b0c731a123a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1139.966374] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1139.977353] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d9dcb5d4-e8a3-4d4d-af94-1bde87121c08 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1139.988845] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 57445f5b-8a3a-4d55-b926-ee2d3e24b6ce has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1139.998927] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 53718899-b65f-4e3b-a8d6-7277e946ab43 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1140.009826] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 236faf76-d72e-4c2b-9b44-9d1866491310 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1140.018753] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f36ba9db-c547-4d77-9e49-24bfcc995e89 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1140.028712] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9cd7ef82-147a-4303-a773-32b161f819ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1140.038660] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fbf2ae36-60a6-48e2-b115-22b13b5c4cc2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1140.048833] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 02b1d6da-0aa2-4199-a86a-fa5b197b2813 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1140.061840] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance b2cc5ba7-c5d1-4ecf-ba3a-fee3facbd159 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1140.070754] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 4cdc869e-2b97-4107-ae4d-49f99131048a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1140.080604] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7fe6f046-65c9-4464-931c-07e781c497aa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1140.091014] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18c148fb-1cd4-4537-9b77-089e9b272f83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1140.091264] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1140.091428] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1140.334552] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02d5bbcb-bfd9-4af1-ba9b-a0f6c052a2c0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.342199] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6705971-c659-4bec-9a93-f4738f3052c0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.373144] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1cb1852-6749-402a-bfcc-1840d1cc34ca {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.380605] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8504cfce-e8e1-4c00-aad3-d0b28031cc2a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.393659] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1140.403384] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1140.418266] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] 
Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1140.418509] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.527s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1140.418770] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1140.418935] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Cleaning up deleted instances with incomplete migration {{(pid=67964) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11237}} [ 1141.417594] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1141.417872] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1141.418098] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1141.800638] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1142.800843] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1144.796050] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1144.800121] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1144.800295] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1146.800560] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1146.800885] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1146.801022] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1146.821522] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1146.821678] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1146.821807] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1146.821932] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1146.822067] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1146.822192] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1146.822311] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1146.822426] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1146.822539] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1146.822656] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1146.822772] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1149.817600] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1159.672625] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1159.695795] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Getting list of instances from cluster (obj){ [ 1159.695795] env[67964]: value = "domain-c8" [ 1159.695795] env[67964]: _type = "ClusterComputeResource" [ 1159.695795] env[67964]: } {{(pid=67964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1159.697182] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efd4f820-9f29-41b9-998a-c20a58964fee {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.715235] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Got total of 10 instances {{(pid=67964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1159.715409] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid fed6991c-9b59-43bb-8cda-96053adb798b {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1159.715598] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 707828f6-0267-42ff-95e5-6b328382b017 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1159.715777] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 0768fe80-7dd3-42ec-8e22-42a6aece5bef {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1159.715949] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 9e47d3ce-3897-458b-ac85-d98745e9aeb5 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1159.716116] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid ea492fb8-2352-436c-a7d5-f20423f4d353 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1159.716269] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid c648c89a-ca70-4a15-9083-0cbe9e5bee23 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1159.716416] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 9793d383-9033-4f86-b7bb-6b2e43347cd6 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1159.716568] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 5fbee4c3-bc7c-4582-b976-b0d619a69cdb {{(pid=67964) _sync_power_states 
/opt/stack/nova/nova/compute/manager.py:10322}} [ 1159.716713] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 67eb58c3-a895-4427-9197-3b0c731a123a {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1159.716854] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1159.717189] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "fed6991c-9b59-43bb-8cda-96053adb798b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.717421] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "707828f6-0267-42ff-95e5-6b328382b017" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.717618] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.717810] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "9e47d3ce-3897-458b-ac85-d98745e9aeb5" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.717998] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "ea492fb8-2352-436c-a7d5-f20423f4d353" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.718198] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.718384] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "9793d383-9033-4f86-b7bb-6b2e43347cd6" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.718766] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.718996] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "67eb58c3-a895-4427-9197-3b0c731a123a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.719209] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1167.536418] env[67964]: WARNING oslo_vmware.rw_handles [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1167.536418] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1167.536418] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1167.536418] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1167.536418] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1167.536418] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1167.536418] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1167.536418] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1167.536418] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1167.536418] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1167.536418] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1167.536418] env[67964]: ERROR oslo_vmware.rw_handles [ 1167.537391] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/d7f16a9a-8e1f-493f-9033-d5e8bc0c4091/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1167.539119] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1167.539389] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Copying Virtual Disk [datastore1] 
vmware_temp/d7f16a9a-8e1f-493f-9033-d5e8bc0c4091/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/d7f16a9a-8e1f-493f-9033-d5e8bc0c4091/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1167.539680] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-938852d2-50e4-41ce-9aee-9604e1707246 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1167.548464] env[67964]: DEBUG oslo_vmware.api [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Waiting for the task: (returnval){ [ 1167.548464] env[67964]: value = "task-3456790" [ 1167.548464] env[67964]: _type = "Task" [ 1167.548464] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1167.556781] env[67964]: DEBUG oslo_vmware.api [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Task: {'id': task-3456790, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1168.059368] env[67964]: DEBUG oslo_vmware.exceptions [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1168.059651] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1168.060230] env[67964]: ERROR nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1168.060230] env[67964]: Faults: ['InvalidArgument'] [ 1168.060230] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Traceback (most recent call last): [ 1168.060230] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1168.060230] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] yield resources [ 1168.060230] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1168.060230] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] self.driver.spawn(context, instance, image_meta, [ 1168.060230] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1168.060230] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1168.060230] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1168.060230] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] self._fetch_image_if_missing(context, vi) [ 1168.060230] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1168.060596] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] image_cache(vi, tmp_image_ds_loc) [ 1168.060596] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1168.060596] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] vm_util.copy_virtual_disk( [ 1168.060596] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1168.060596] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] session._wait_for_task(vmdk_copy_task) [ 1168.060596] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1168.060596] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] return self.wait_for_task(task_ref) [ 1168.060596] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1168.060596] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] return evt.wait() [ 1168.060596] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1168.060596] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] result = hub.switch() [ 1168.060596] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1168.060596] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] return self.greenlet.switch() [ 1168.060951] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1168.060951] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] self.f(*self.args, **self.kw) [ 1168.060951] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1168.060951] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] raise exceptions.translate_fault(task_info.error) [ 
1168.060951] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1168.060951] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Faults: ['InvalidArgument'] [ 1168.060951] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] [ 1168.060951] env[67964]: INFO nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Terminating instance [ 1168.062123] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1168.062331] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1168.062574] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c57d8042-6fb1-4b9a-8859-3f94e34f1342 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.064785] env[67964]: DEBUG nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1168.065048] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1168.065710] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9c532fc-09f7-4285-8e80-0687859e78a5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.072676] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1168.072886] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9332b1aa-65d4-4169-88d5-ef45a29e0b5b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.075085] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1168.075261] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1168.076210] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-478aed95-a4c1-47fd-a22c-6f575c06d9e5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.080651] env[67964]: DEBUG oslo_vmware.api [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Waiting for the task: (returnval){ [ 1168.080651] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52157493-c5de-a692-762c-4f2d2485dd99" [ 1168.080651] env[67964]: _type = "Task" [ 1168.080651] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1168.087814] env[67964]: DEBUG oslo_vmware.api [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52157493-c5de-a692-762c-4f2d2485dd99, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1168.147101] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1168.147342] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1168.147516] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Deleting the datastore file [datastore1] fed6991c-9b59-43bb-8cda-96053adb798b {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1168.147783] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-88af2750-be61-4bcd-8233-2abc6c84e2a0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.153815] env[67964]: DEBUG oslo_vmware.api [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Waiting for the task: (returnval){ [ 1168.153815] env[67964]: value = "task-3456792" [ 1168.153815] env[67964]: _type = "Task" [ 1168.153815] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1168.161886] env[67964]: DEBUG oslo_vmware.api [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Task: {'id': task-3456792, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1168.590743] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1168.591035] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Creating directory with path [datastore1] vmware_temp/35f76b0b-e93f-4889-a4b9-dc78b85436d2/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1168.591252] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-aa990a7f-cf48-43e1-80a4-875d5b21e864 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.603547] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Created directory with path [datastore1] vmware_temp/35f76b0b-e93f-4889-a4b9-dc78b85436d2/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1168.603769] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Fetch image to [datastore1] vmware_temp/35f76b0b-e93f-4889-a4b9-dc78b85436d2/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1168.603939] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/35f76b0b-e93f-4889-a4b9-dc78b85436d2/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1168.604698] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bf038c6-6498-44de-a6eb-b82672f0d0e5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.611506] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0caaecc0-805b-42da-8e2d-ea538e6911a3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.620778] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aff4d88e-9270-4090-b6f5-9dd824bcf50d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.651664] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-7acaaec0-2689-4e06-a747-50c9e4747efd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.662852] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-48b21a82-df2c-4a4e-afa3-ffd1797e160a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.664557] env[67964]: DEBUG oslo_vmware.api [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Task: {'id': task-3456792, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069595} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1168.664798] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1168.665018] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1168.665209] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1168.665390] env[67964]: INFO nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1168.667482] env[67964]: DEBUG nova.compute.claims [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1168.667645] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1168.667852] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1168.687137] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1168.739615] env[67964]: DEBUG oslo_vmware.rw_handles [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/35f76b0b-e93f-4889-a4b9-dc78b85436d2/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1168.798666] env[67964]: DEBUG oslo_vmware.rw_handles [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1168.798850] env[67964]: DEBUG oslo_vmware.rw_handles [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/35f76b0b-e93f-4889-a4b9-dc78b85436d2/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1169.008380] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c48af9e-0fc3-4213-a310-bd4b9686c076 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1169.016263] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d33f7c8-0a26-48fe-afad-fab76b24646e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1169.046079] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-486ace65-14e6-4f9e-adee-d3b5411d1e5e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1169.053187] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8547f0e0-b0e8-4d2f-a41c-f44fd4b186e1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1169.066760] env[67964]: DEBUG nova.compute.provider_tree [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1169.075103] env[67964]: DEBUG nova.scheduler.client.report [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1169.089030] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.420s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1169.089030] env[67964]: ERROR nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1169.089030] env[67964]: Faults: ['InvalidArgument'] [ 1169.089030] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Traceback (most recent call last): [ 1169.089030] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1169.089030] env[67964]: ERROR 
nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] self.driver.spawn(context, instance, image_meta, [ 1169.089030] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1169.089030] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1169.089030] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1169.089030] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] self._fetch_image_if_missing(context, vi) [ 1169.089571] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1169.089571] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] image_cache(vi, tmp_image_ds_loc) [ 1169.089571] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1169.089571] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] vm_util.copy_virtual_disk( [ 1169.089571] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1169.089571] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] session._wait_for_task(vmdk_copy_task) [ 1169.089571] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1169.089571] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] return self.wait_for_task(task_ref) [ 1169.089571] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1169.089571] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] return evt.wait() [ 1169.089571] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1169.089571] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] result = hub.switch() [ 1169.089571] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1169.090129] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] return self.greenlet.switch() [ 1169.090129] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1169.090129] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] self.f(*self.args, **self.kw) [ 1169.090129] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1169.090129] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] raise exceptions.translate_fault(task_info.error) [ 1169.090129] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1169.090129] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Faults: ['InvalidArgument'] [ 1169.090129] env[67964]: ERROR nova.compute.manager [instance: fed6991c-9b59-43bb-8cda-96053adb798b] [ 1169.090129] env[67964]: DEBUG nova.compute.utils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1169.090985] env[67964]: DEBUG nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Build of instance fed6991c-9b59-43bb-8cda-96053adb798b was re-scheduled: A specified parameter was not correct: fileType [ 1169.090985] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1169.091381] env[67964]: DEBUG nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1169.091549] env[67964]: DEBUG nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
[ 1169.091715] env[67964]: DEBUG nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1169.091875] env[67964]: DEBUG nova.network.neutron [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1169.493975] env[67964]: DEBUG nova.network.neutron [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1169.509440] env[67964]: INFO nova.compute.manager [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Took 0.42 seconds to deallocate network for instance. [ 1169.602776] env[67964]: INFO nova.scheduler.client.report [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Deleted allocations for instance fed6991c-9b59-43bb-8cda-96053adb798b [ 1169.624198] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a50dfa45-e14c-44e0-8b6e-cd81fdeec8c9 tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Lock "fed6991c-9b59-43bb-8cda-96053adb798b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 570.422s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1169.625596] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bf83786f-1361-4017-9def-68771e1eb59c tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Lock "fed6991c-9b59-43bb-8cda-96053adb798b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 372.020s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1169.626306] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bf83786f-1361-4017-9def-68771e1eb59c tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Acquiring lock "fed6991c-9b59-43bb-8cda-96053adb798b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1169.626306] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bf83786f-1361-4017-9def-68771e1eb59c tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Lock "fed6991c-9b59-43bb-8cda-96053adb798b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s
{{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1169.626306] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bf83786f-1361-4017-9def-68771e1eb59c tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Lock "fed6991c-9b59-43bb-8cda-96053adb798b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1169.628064] env[67964]: INFO nova.compute.manager [None req-bf83786f-1361-4017-9def-68771e1eb59c tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Terminating instance [ 1169.629894] env[67964]: DEBUG nova.compute.manager [None req-bf83786f-1361-4017-9def-68771e1eb59c tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1169.630088] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bf83786f-1361-4017-9def-68771e1eb59c tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1169.630563] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-eba7559a-2e28-4156-b12a-c029478fe376 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1169.636780] env[67964]: DEBUG nova.compute.manager [None req-01951162-df92-4a21-8fe9-8c5624bd2208 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 4877dc66-8ac8-4f7e-9a49-97a7adb95e72] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1169.645053] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0473f9ce-9e0e-4f68-af69-cc8b08c163da {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1169.661825] env[67964]: DEBUG nova.compute.manager [None req-01951162-df92-4a21-8fe9-8c5624bd2208 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 4877dc66-8ac8-4f7e-9a49-97a7adb95e72] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1169.672834] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-bf83786f-1361-4017-9def-68771e1eb59c tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fed6991c-9b59-43bb-8cda-96053adb798b could not be found. 
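The SearchIndex.FindAllByUuid invocation and the InstanceNotFound warning above are two halves of one lookup. A minimal sketch of that pattern, assuming oslo.vmware's VMwareAPISession.invoke_api semantics (the helper name and the empty-result handling are illustrative, not Nova's exact code):

    def find_vm_by_instance_uuid(session, instance_uuid):
        # FindAllByUuid with vmSearch=True and instanceUuid=True matches
        # VMs whose vSphere instanceUuid equals the Nova instance UUID.
        vm_refs = session.invoke_api(
            session.vim, 'FindAllByUuid',
            session.vim.service_content.searchIndex,
            uuid=instance_uuid, vmSearch=True, instanceUuid=True)
        # An empty result is what surfaces above as "Instance does not
        # exist on backend" / InstanceNotFound.
        return vm_refs[0] if vm_refs else None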
[ 1169.673046] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bf83786f-1361-4017-9def-68771e1eb59c tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1169.673229] env[67964]: INFO nova.compute.manager [None req-bf83786f-1361-4017-9def-68771e1eb59c tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1169.673461] env[67964]: DEBUG oslo.service.loopingcall [None req-bf83786f-1361-4017-9def-68771e1eb59c tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1169.673666] env[67964]: DEBUG nova.compute.manager [-] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1169.673977] env[67964]: DEBUG nova.network.neutron [-] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1169.690024] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01951162-df92-4a21-8fe9-8c5624bd2208 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Lock "4877dc66-8ac8-4f7e-9a49-97a7adb95e72" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.442s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1169.704020] env[67964]: DEBUG nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1169.714645] env[67964]: DEBUG nova.network.neutron [-] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1169.725014] env[67964]: INFO nova.compute.manager [-] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] Took 0.05 seconds to deallocate network for instance. 
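The "Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return." entry above is oslo.service's looping-call machinery. A minimal, self-contained sketch of that retry pattern (the interval and the immediately-succeeding body are illustrative):

    from oslo_service import loopingcall

    def _deallocate_with_retries():
        # A real implementation would attempt the network deallocation
        # and simply return to be retried on transient failure; raising
        # LoopingCallDone stops the loop once the call succeeds.
        raise loopingcall.LoopingCallDone()

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
    timer.start(interval=1).wait()  # interval chosen for illustration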
[ 1169.754811] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1169.755128] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1169.756665] env[67964]: INFO nova.compute.claims [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1169.824837] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bf83786f-1361-4017-9def-68771e1eb59c tempest-ServerPasswordTestJSON-731895431 tempest-ServerPasswordTestJSON-731895431-project-member] Lock "fed6991c-9b59-43bb-8cda-96053adb798b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.199s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1169.825205] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "fed6991c-9b59-43bb-8cda-96053adb798b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 10.108s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1169.825383] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: fed6991c-9b59-43bb-8cda-96053adb798b] During sync_power_state the instance has a pending task (deleting). Skip. 
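Every "Acquiring lock ... / Lock ... acquired ... waited / Lock ... "released" ... held" triplet in this log, including the compute_resources claim above, comes from oslo.concurrency's lockutils. A minimal sketch of the decorator that produces them (the function body is a placeholder):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim():
        # Runs with the in-process "compute_resources" lock held;
        # lockutils itself emits the DEBUG waited/held timings.
        pass

    instance_claim()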
[ 1169.825559] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "fed6991c-9b59-43bb-8cda-96053adb798b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1170.048450] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7028a5b-e832-43e6-b136-7dc90b37d97e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1170.055790] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24534d4b-dbe5-435d-983d-b71c630a4eb2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1170.084399] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-747e2e73-b9cb-475c-afb3-a4019eac69b3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1170.091094] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-342494cc-cf7c-4c3b-b31a-0bc8a6f972a8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1170.103702] env[67964]: DEBUG nova.compute.provider_tree [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1170.112971] env[67964]: DEBUG nova.scheduler.client.report [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1170.125460] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.370s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1170.125931] env[67964]: DEBUG nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Start building networks asynchronously for instance. 
{{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1170.162056] env[67964]: DEBUG nova.compute.utils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1170.163983] env[67964]: DEBUG nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1170.164274] env[67964]: DEBUG nova.network.neutron [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1170.172800] env[67964]: DEBUG nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1170.241587] env[67964]: DEBUG nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1170.254081] env[67964]: DEBUG nova.policy [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '68954824009043edb8c280efef2a782f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c5861e63d37045fba770c534732fe26d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1170.265621] env[67964]: DEBUG nova.virt.hardware [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1170.265740] env[67964]: DEBUG nova.virt.hardware [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1170.265863] env[67964]: DEBUG nova.virt.hardware [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1170.266110] env[67964]: DEBUG nova.virt.hardware [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1170.266268] env[67964]: DEBUG nova.virt.hardware [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1170.266410] env[67964]: DEBUG nova.virt.hardware [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1170.266612] env[67964]: DEBUG nova.virt.hardware [None 
req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1170.266766] env[67964]: DEBUG nova.virt.hardware [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1170.266933] env[67964]: DEBUG nova.virt.hardware [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1170.267105] env[67964]: DEBUG nova.virt.hardware [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1170.267279] env[67964]: DEBUG nova.virt.hardware [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1170.268416] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1c62205-7e84-45f8-8432-ced95fa04541 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1170.275871] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e74ad268-711a-486b-b155-8604224888b7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1170.756787] env[67964]: DEBUG nova.network.neutron [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Successfully created port: 4a9bf15e-628a-47f4-bcc6-2154a9d93f8c {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1171.590101] env[67964]: DEBUG nova.compute.manager [req-8efcbe3e-e0b8-42d6-ac54-9a4da32679bf req-3033285f-fccc-4ff1-b16d-84f05bfadc8b service nova] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Received event network-vif-plugged-4a9bf15e-628a-47f4-bcc6-2154a9d93f8c {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1171.590101] env[67964]: DEBUG oslo_concurrency.lockutils [req-8efcbe3e-e0b8-42d6-ac54-9a4da32679bf req-3033285f-fccc-4ff1-b16d-84f05bfadc8b service nova] Acquiring lock "d9dcb5d4-e8a3-4d4d-af94-1bde87121c08-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1171.590101] env[67964]: DEBUG oslo_concurrency.lockutils [req-8efcbe3e-e0b8-42d6-ac54-9a4da32679bf req-3033285f-fccc-4ff1-b16d-84f05bfadc8b service nova] Lock 
"d9dcb5d4-e8a3-4d4d-af94-1bde87121c08-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1171.590101] env[67964]: DEBUG oslo_concurrency.lockutils [req-8efcbe3e-e0b8-42d6-ac54-9a4da32679bf req-3033285f-fccc-4ff1-b16d-84f05bfadc8b service nova] Lock "d9dcb5d4-e8a3-4d4d-af94-1bde87121c08-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1171.590471] env[67964]: DEBUG nova.compute.manager [req-8efcbe3e-e0b8-42d6-ac54-9a4da32679bf req-3033285f-fccc-4ff1-b16d-84f05bfadc8b service nova] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] No waiting events found dispatching network-vif-plugged-4a9bf15e-628a-47f4-bcc6-2154a9d93f8c {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1171.590471] env[67964]: WARNING nova.compute.manager [req-8efcbe3e-e0b8-42d6-ac54-9a4da32679bf req-3033285f-fccc-4ff1-b16d-84f05bfadc8b service nova] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Received unexpected event network-vif-plugged-4a9bf15e-628a-47f4-bcc6-2154a9d93f8c for instance with vm_state building and task_state spawning. [ 1171.608846] env[67964]: DEBUG nova.network.neutron [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Successfully updated port: 4a9bf15e-628a-47f4-bcc6-2154a9d93f8c {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1171.628996] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Acquiring lock "refresh_cache-d9dcb5d4-e8a3-4d4d-af94-1bde87121c08" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1171.628996] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Acquired lock "refresh_cache-d9dcb5d4-e8a3-4d4d-af94-1bde87121c08" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1171.628996] env[67964]: DEBUG nova.network.neutron [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1171.680882] env[67964]: DEBUG nova.network.neutron [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Instance cache missing network info. 
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1171.934148] env[67964]: DEBUG nova.network.neutron [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Updating instance_info_cache with network_info: [{"id": "4a9bf15e-628a-47f4-bcc6-2154a9d93f8c", "address": "fa:16:3e:78:0c:25", "network": {"id": "cc9b4ae3-819a-4f81-8604-def748bacb4a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-703629903-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c5861e63d37045fba770c534732fe26d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b2ede0e6-8d7a-4018-bb37-25bf388e9867", "external-id": "nsx-vlan-transportzone-945", "segmentation_id": 945, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4a9bf15e-62", "ovs_interfaceid": "4a9bf15e-628a-47f4-bcc6-2154a9d93f8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1171.956223] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Releasing lock "refresh_cache-d9dcb5d4-e8a3-4d4d-af94-1bde87121c08" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1171.956645] env[67964]: DEBUG nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Instance network_info: |[{"id": "4a9bf15e-628a-47f4-bcc6-2154a9d93f8c", "address": "fa:16:3e:78:0c:25", "network": {"id": "cc9b4ae3-819a-4f81-8604-def748bacb4a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-703629903-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c5861e63d37045fba770c534732fe26d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b2ede0e6-8d7a-4018-bb37-25bf388e9867", "external-id": "nsx-vlan-transportzone-945", "segmentation_id": 945, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4a9bf15e-62", "ovs_interfaceid": "4a9bf15e-628a-47f4-bcc6-2154a9d93f8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 1171.956941] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:78:0c:25', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b2ede0e6-8d7a-4018-bb37-25bf388e9867', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4a9bf15e-628a-47f4-bcc6-2154a9d93f8c', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1171.964365] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Creating folder: Project (c5861e63d37045fba770c534732fe26d). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1171.964894] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-151f6d4d-a4e0-4dbe-96cd-d2ac74fa8f3e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1171.974449] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Created folder: Project (c5861e63d37045fba770c534732fe26d) in parent group-v690366. [ 1171.974635] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Creating folder: Instances. Parent ref: group-v690436. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1171.974854] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b7a81d00-bd30-496a-89b3-3b09b2772d7e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1171.982445] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Created folder: Instances in parent group-v690436. [ 1171.982670] env[67964]: DEBUG oslo.service.loopingcall [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1171.982842] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1171.983081] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-36b06443-96b9-4a05-a170-619c8070f26f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1172.000734] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1172.000734] env[67964]: value = "task-3456795" [ 1172.000734] env[67964]: _type = "Task" [ 1172.000734] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1172.008657] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456795, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1172.511294] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456795, 'name': CreateVM_Task, 'duration_secs': 0.270461} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1172.511462] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1172.512117] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1172.512284] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1172.512588] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1172.512826] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-03991e06-0ca6-4c3a-8c68-8a1434a26508 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1172.517129] env[67964]: DEBUG oslo_vmware.api [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Waiting for the task: (returnval){ [ 1172.517129] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5290f03d-a525-c9d3-ad75-bfb8b3e997a7" [ 1172.517129] env[67964]: _type = "Task" [ 1172.517129] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1172.524572] env[67964]: DEBUG oslo_vmware.api [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5290f03d-a525-c9d3-ad75-bfb8b3e997a7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
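The CreateVM_Task and SearchDatastore_Task exchanges above follow vSphere's invoke-then-poll contract: the *_Task call returns a task reference immediately, and the session polls it (the "progress is 0%." lines) until it completes or raises a translated fault, as the earlier VimFaultException did. A minimal sketch under the same oslo.vmware session assumptions (argument names follow the vSphere CreateVM_Task signature; error handling is omitted):

    def create_vm(session, folder_ref, config_spec, res_pool_ref):
        task_ref = session.invoke_api(
            session.vim, 'CreateVM_Task', folder_ref,
            config=config_spec, pool=res_pool_ref)
        # Blocks while oslo.vmware polls the task; raises a translated
        # exception (e.g. VimFaultException) if the task errors out.
        task_info = session.wait_for_task(task_ref)
        return task_info.result  # moref of the newly created VM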
[ 1173.027354] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1173.027639] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1173.027845] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1173.618226] env[67964]: DEBUG nova.compute.manager [req-c59bd045-996e-48a0-a0fd-07a5e8cd7076 req-da5fd706-aca9-4e87-bb45-8282a216a51d service nova] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Received event network-changed-4a9bf15e-628a-47f4-bcc6-2154a9d93f8c {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1173.618361] env[67964]: DEBUG nova.compute.manager [req-c59bd045-996e-48a0-a0fd-07a5e8cd7076 req-da5fd706-aca9-4e87-bb45-8282a216a51d service nova] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Refreshing instance network info cache due to event network-changed-4a9bf15e-628a-47f4-bcc6-2154a9d93f8c. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1173.618572] env[67964]: DEBUG oslo_concurrency.lockutils [req-c59bd045-996e-48a0-a0fd-07a5e8cd7076 req-da5fd706-aca9-4e87-bb45-8282a216a51d service nova] Acquiring lock "refresh_cache-d9dcb5d4-e8a3-4d4d-af94-1bde87121c08" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1173.618709] env[67964]: DEBUG oslo_concurrency.lockutils [req-c59bd045-996e-48a0-a0fd-07a5e8cd7076 req-da5fd706-aca9-4e87-bb45-8282a216a51d service nova] Acquired lock "refresh_cache-d9dcb5d4-e8a3-4d4d-af94-1bde87121c08" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1173.618864] env[67964]: DEBUG nova.network.neutron [req-c59bd045-996e-48a0-a0fd-07a5e8cd7076 req-da5fd706-aca9-4e87-bb45-8282a216a51d service nova] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Refreshing network info cache for port 4a9bf15e-628a-47f4-bcc6-2154a9d93f8c {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1173.965752] env[67964]: DEBUG nova.network.neutron [req-c59bd045-996e-48a0-a0fd-07a5e8cd7076 req-da5fd706-aca9-4e87-bb45-8282a216a51d service nova] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Updated VIF entry in instance network info cache for port 4a9bf15e-628a-47f4-bcc6-2154a9d93f8c.
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1173.966119] env[67964]: DEBUG nova.network.neutron [req-c59bd045-996e-48a0-a0fd-07a5e8cd7076 req-da5fd706-aca9-4e87-bb45-8282a216a51d service nova] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Updating instance_info_cache with network_info: [{"id": "4a9bf15e-628a-47f4-bcc6-2154a9d93f8c", "address": "fa:16:3e:78:0c:25", "network": {"id": "cc9b4ae3-819a-4f81-8604-def748bacb4a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-703629903-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c5861e63d37045fba770c534732fe26d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b2ede0e6-8d7a-4018-bb37-25bf388e9867", "external-id": "nsx-vlan-transportzone-945", "segmentation_id": 945, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4a9bf15e-62", "ovs_interfaceid": "4a9bf15e-628a-47f4-bcc6-2154a9d93f8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1173.975750] env[67964]: DEBUG oslo_concurrency.lockutils [req-c59bd045-996e-48a0-a0fd-07a5e8cd7076 req-da5fd706-aca9-4e87-bb45-8282a216a51d service nova] Releasing lock "refresh_cache-d9dcb5d4-e8a3-4d4d-af94-1bde87121c08" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1181.351908] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a6c24c3f-f60d-439a-b6fc-935a696b44f8 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Acquiring lock "d9dcb5d4-e8a3-4d4d-af94-1bde87121c08" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.606207] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Acquiring lock "18d6df82-a19a-499a-8874-171218569651" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.606542] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Lock "18d6df82-a19a-499a-8874-171218569651" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1195.971807] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 
tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "ee34b117-806d-4cc4-98b7-0f40f074cfab" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1195.972095] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "ee34b117-806d-4cc4-98b7-0f40f074cfab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1198.395630] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "7825ba9e-8603-4211-b5fe-708276272464" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1198.395630] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "7825ba9e-8603-4211-b5fe-708276272464" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1199.800488] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1199.816514] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1199.816754] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1199.816920] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1199.817089] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1199.818612] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5b315fe1-2cbd-4206-948a-873dbc1adf7d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.828089] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3855ce2-3acb-4fe8-bb23-81a6d30cd273 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.845690] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-161d9da6-57bf-4628-9821-e953b103ccfd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.853441] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5647bf7f-2459-4a35-ac9e-db4eb8e747ec {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.897267] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180918MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1199.897534] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1199.897846] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1200.065187] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 707828f6-0267-42ff-95e5-6b328382b017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1200.065187] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0768fe80-7dd3-42ec-8e22-42a6aece5bef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1200.065187] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9e47d3ce-3897-458b-ac85-d98745e9aeb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1200.065187] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea492fb8-2352-436c-a7d5-f20423f4d353 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1200.065384] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c648c89a-ca70-4a15-9083-0cbe9e5bee23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1200.065384] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9793d383-9033-4f86-b7bb-6b2e43347cd6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1200.065384] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 5fbee4c3-bc7c-4582-b976-b0d619a69cdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1200.065384] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 67eb58c3-a895-4427-9197-3b0c731a123a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1200.065559] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1200.065559] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d9dcb5d4-e8a3-4d4d-af94-1bde87121c08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1200.075351] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance f36ba9db-c547-4d77-9e49-24bfcc995e89 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.087160] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9cd7ef82-147a-4303-a773-32b161f819ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.099966] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fbf2ae36-60a6-48e2-b115-22b13b5c4cc2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.112057] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 02b1d6da-0aa2-4199-a86a-fa5b197b2813 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.121887] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance b2cc5ba7-c5d1-4ecf-ba3a-fee3facbd159 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.139049] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 4cdc869e-2b97-4107-ae4d-49f99131048a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.150592] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7fe6f046-65c9-4464-931c-07e781c497aa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.161192] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18c148fb-1cd4-4537-9b77-089e9b272f83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.171321] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18d6df82-a19a-499a-8874-171218569651 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.183971] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ee34b117-806d-4cc4-98b7-0f40f074cfab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.198800] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7825ba9e-8603-4211-b5fe-708276272464 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.198800] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1200.198800] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1200.221412] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing inventories for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:818}} [ 1200.242975] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Updating ProviderTree inventory for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:782}} [ 1200.242975] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Updating inventory in ProviderTree for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1200.256816] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing aggregate associations for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41, aggregates: None {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:827}} [ 1200.285372] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing trait associations for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:839}} [ 1200.622051] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6197199a-0d4d-4014-9271-4cb1b5ec25e2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.630221] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f950af8-b0b8-4728-b4e9-b3f1d59b1067 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.660165] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c645da8-e0ce-40a3-bfa1-447a9ff8be41 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.667404] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3e91817-a405-4906-bf76-7743f7643449 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.680286] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1200.691247] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1200.708123] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1200.708341] env[67964]: DEBUG oslo_concurrency.lockutils [None 
req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.810s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1201.226653] env[67964]: DEBUG oslo_concurrency.lockutils [None req-530a2849-71f7-4865-a1c6-446fda5b7ea7 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "cc4cdc79-2620-42c6-bf3d-0b108a2cbfe0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1201.226653] env[67964]: DEBUG oslo_concurrency.lockutils [None req-530a2849-71f7-4865-a1c6-446fda5b7ea7 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "cc4cdc79-2620-42c6-bf3d-0b108a2cbfe0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1201.708609] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1201.708796] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1201.800733] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1202.800432] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1202.800837] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1204.795926] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1204.799920] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1204.800413] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1205.497127] env[67964]: DEBUG oslo_concurrency.lockutils [None req-be1d3380-6677-4de9-a32e-a3485a81bc8d tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "fb025130-d995-4615-8dee-59af1700877f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1205.497541] env[67964]: DEBUG oslo_concurrency.lockutils [None req-be1d3380-6677-4de9-a32e-a3485a81bc8d tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "fb025130-d995-4615-8dee-59af1700877f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1206.804048] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1206.804048] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1206.804048] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1206.837448] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1206.837448] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1206.837791] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1206.838152] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1206.838456] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1206.838721] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1206.838992] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1206.840089] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1206.840089] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1206.840089] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1206.840089] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1207.698689] env[67964]: DEBUG oslo_concurrency.lockutils [None req-5a3516b0-0f68-4acd-9a5d-02535f0c84bc tempest-ServersTestMultiNic-682379730 tempest-ServersTestMultiNic-682379730-project-member] Acquiring lock "60cd7925-3124-449e-8d27-4faa7b27cb9c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1207.698951] env[67964]: DEBUG oslo_concurrency.lockutils [None req-5a3516b0-0f68-4acd-9a5d-02535f0c84bc tempest-ServersTestMultiNic-682379730 tempest-ServersTestMultiNic-682379730-project-member] Lock "60cd7925-3124-449e-8d27-4faa7b27cb9c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1215.392627] env[67964]: WARNING oslo_vmware.rw_handles [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1215.392627] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1215.392627] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1215.392627] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1215.392627] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1215.392627] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1215.392627] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1215.392627] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1215.392627] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1215.392627] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1215.392627] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1215.392627] env[67964]: ERROR oslo_vmware.rw_handles [ 1215.393308] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/35f76b0b-e93f-4889-a4b9-dc78b85436d2/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1215.394975] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1215.395238] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 
tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Copying Virtual Disk [datastore1] vmware_temp/35f76b0b-e93f-4889-a4b9-dc78b85436d2/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/35f76b0b-e93f-4889-a4b9-dc78b85436d2/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1215.395520] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cfcb9621-d75b-4f5b-bc4f-bda43b1a7d1c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.404116] env[67964]: DEBUG oslo_vmware.api [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Waiting for the task: (returnval){ [ 1215.404116] env[67964]: value = "task-3456796" [ 1215.404116] env[67964]: _type = "Task" [ 1215.404116] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1215.411694] env[67964]: DEBUG oslo_vmware.api [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Task: {'id': task-3456796, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1215.914777] env[67964]: DEBUG oslo_vmware.exceptions [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Fault InvalidArgument not matched. 
{{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1215.915092] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1215.915682] env[67964]: ERROR nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1215.915682] env[67964]: Faults: ['InvalidArgument'] [ 1215.915682] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] Traceback (most recent call last): [ 1215.915682] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1215.915682] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] yield resources [ 1215.915682] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1215.915682] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] self.driver.spawn(context, instance, image_meta, [ 1215.915682] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1215.915682] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1215.915682] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1215.915682] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] self._fetch_image_if_missing(context, vi) [ 1215.915682] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1215.916152] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] image_cache(vi, tmp_image_ds_loc) [ 1215.916152] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1215.916152] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] vm_util.copy_virtual_disk( [ 1215.916152] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1215.916152] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] session._wait_for_task(vmdk_copy_task) [ 1215.916152] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1215.916152] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] return self.wait_for_task(task_ref) [ 1215.916152] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1215.916152] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] return evt.wait() [ 1215.916152] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1215.916152] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] result = hub.switch() [ 1215.916152] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1215.916152] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] return self.greenlet.switch() [ 1215.916585] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1215.916585] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] self.f(*self.args, **self.kw) [ 1215.916585] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1215.916585] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] raise exceptions.translate_fault(task_info.error) [ 1215.916585] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1215.916585] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] Faults: ['InvalidArgument'] [ 1215.916585] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] [ 1215.916585] env[67964]: INFO nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Terminating instance [ 1215.917524] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1215.917728] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1215.918364] env[67964]: DEBUG nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 
tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1215.918548] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1215.918762] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3d1f1870-afa6-4ced-b9f1-d4d6e982bfe2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.921016] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1eff4da7-de35-4c51-9851-e0264b442747 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.927877] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1215.928095] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-38f1b579-f615-4e7a-b864-7715a54006b9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.930258] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1215.930464] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1215.931355] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b0a2dcad-de37-46dd-8131-9c4ac4ce74fd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.936252] env[67964]: DEBUG oslo_vmware.api [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Waiting for the task: (returnval){ [ 1215.936252] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]521f51e1-cd0a-577b-9a9b-b97b15d373f9" [ 1215.936252] env[67964]: _type = "Task" [ 1215.936252] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1215.943414] env[67964]: DEBUG oslo_vmware.api [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]521f51e1-cd0a-577b-9a9b-b97b15d373f9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1215.998061] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1215.998061] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1215.998061] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Deleting the datastore file [datastore1] 707828f6-0267-42ff-95e5-6b328382b017 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1215.998061] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7ddd6c0e-45b8-4338-b72e-c793437480f2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.004083] env[67964]: DEBUG oslo_vmware.api [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Waiting for the task: (returnval){ [ 1216.004083] env[67964]: value = "task-3456798" [ 1216.004083] env[67964]: _type = "Task" [ 1216.004083] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1216.011543] env[67964]: DEBUG oslo_vmware.api [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Task: {'id': task-3456798, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1216.446406] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1216.446724] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Creating directory with path [datastore1] vmware_temp/3b8bc28c-34e7-495d-b380-60dda9f910fc/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1216.446895] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ccf6f29a-882f-486f-b92b-6edd8279bb25 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.458539] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Created directory with path [datastore1] vmware_temp/3b8bc28c-34e7-495d-b380-60dda9f910fc/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1216.458737] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Fetch image to [datastore1] vmware_temp/3b8bc28c-34e7-495d-b380-60dda9f910fc/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1216.458924] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/3b8bc28c-34e7-495d-b380-60dda9f910fc/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1216.459622] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-813c2d98-ef30-4cea-8054-20dcfbc0c173 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.466089] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f87b499-b3e1-4728-a173-6f9a43a94e8e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.474805] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71aadceb-620b-463a-8789-532770c9542a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.505022] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-95bcb569-6e47-4e86-aede-1590fe66654a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.516572] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-99cae0b7-e251-4f09-b16e-f20f8e53c4e2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.518221] env[67964]: DEBUG oslo_vmware.api [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Task: {'id': task-3456798, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082171} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1216.518448] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1216.518625] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1216.518812] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1216.519037] env[67964]: INFO nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Took 0.60 seconds to destroy the instance on the hypervisor. 
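The CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above all follow the same oslo.vmware pattern: the caller invokes a vCenter task, logs "Waiting for the task ... to complete", and a looping call polls the task ("progress is 0%.") until it reaches a terminal state; on success the final poll carries duration_secs (as in task-3456798 above), while on failure the fault is translated into an exception (here VimFaultException with Faults: ['InvalidArgument']). A minimal sketch of that poll loop, for orientation only — this is not oslo.vmware's actual implementation, and FakeTask and poll_interval are invented for the example:

    import time

    class FakeTask:
        """Stands in for a vCenter task handle; yields (state, detail) pairs."""
        def __init__(self, states):
            self._states = iter(states)

        def info(self):
            return next(self._states)

    def wait_for_task(task, poll_interval=0.5):
        # Poll until the task leaves the running state, mirroring the
        # _poll_task DEBUG lines in the log above.
        while True:
            state, detail = task.info()
            if state == "running":
                print("progress is %s%%" % detail)
                time.sleep(poll_interval)
                continue
            if state == "success":
                return detail
            # Terminal failure: oslo.vmware raises the translated fault here.
            raise RuntimeError("task failed: %s" % detail)

    wait_for_task(FakeTask([("running", 0), ("running", 50), ("success", 100)]),
                  poll_interval=0.01)

The fileType fault earlier in this section surfaces out of exactly this kind of loop: the copy task itself was accepted ("progress is 0%."), and the error only appears when a later poll finds the task in an error state.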
[ 1216.521067] env[67964]: DEBUG nova.compute.claims [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1216.521253] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1216.521458] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1216.545348] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1216.733520] env[67964]: DEBUG oslo_vmware.rw_handles [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3b8bc28c-34e7-495d-b380-60dda9f910fc/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1216.800778] env[67964]: DEBUG oslo_vmware.rw_handles [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1216.801078] env[67964]: DEBUG oslo_vmware.rw_handles [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3b8bc28c-34e7-495d-b380-60dda9f910fc/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1216.912152] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba19f265-a7e1-4c67-b4d9-d0cfb922f475 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.919538] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eeb13ce0-615c-42e0-85fb-f348abef5843 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.948947] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba67670f-5635-4aea-8b97-da91b6b09e1e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.955990] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f93aeaae-3d02-4303-9ca6-bae1624b1418 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.968589] env[67964]: DEBUG nova.compute.provider_tree [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1216.977173] env[67964]: DEBUG nova.scheduler.client.report [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1216.990549] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.469s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1216.991100] env[67964]: ERROR nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1216.991100] env[67964]: Faults: ['InvalidArgument'] [ 1216.991100] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] Traceback (most recent call last): [ 1216.991100] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1216.991100] 
env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] self.driver.spawn(context, instance, image_meta, [ 1216.991100] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1216.991100] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1216.991100] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1216.991100] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] self._fetch_image_if_missing(context, vi) [ 1216.991100] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1216.991100] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] image_cache(vi, tmp_image_ds_loc) [ 1216.991100] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1216.991488] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] vm_util.copy_virtual_disk( [ 1216.991488] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1216.991488] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] session._wait_for_task(vmdk_copy_task) [ 1216.991488] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1216.991488] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] return self.wait_for_task(task_ref) [ 1216.991488] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1216.991488] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] return evt.wait() [ 1216.991488] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1216.991488] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] result = hub.switch() [ 1216.991488] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1216.991488] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] return self.greenlet.switch() [ 1216.991488] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1216.991488] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] self.f(*self.args, **self.kw) [ 1216.991877] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1216.991877] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] raise exceptions.translate_fault(task_info.error) [ 1216.991877] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1216.991877] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] Faults: ['InvalidArgument'] [ 1216.991877] env[67964]: ERROR nova.compute.manager [instance: 707828f6-0267-42ff-95e5-6b328382b017] [ 1216.991877] env[67964]: DEBUG nova.compute.utils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1216.993182] env[67964]: DEBUG nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Build of instance 707828f6-0267-42ff-95e5-6b328382b017 was re-scheduled: A specified parameter was not correct: fileType [ 1216.993182] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1216.993548] env[67964]: DEBUG nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1216.993766] env[67964]: DEBUG nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1216.993968] env[67964]: DEBUG nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1216.994209] env[67964]: DEBUG nova.network.neutron [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1217.356275] env[67964]: DEBUG nova.network.neutron [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1217.371297] env[67964]: INFO nova.compute.manager [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Took 0.38 seconds to deallocate network for instance. [ 1217.465674] env[67964]: INFO nova.scheduler.client.report [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Deleted allocations for instance 707828f6-0267-42ff-95e5-6b328382b017 [ 1217.488751] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3272aa79-45b6-4b42-81ea-7a5e968440f4 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Lock "707828f6-0267-42ff-95e5-6b328382b017" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 615.762s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.490046] env[67964]: DEBUG oslo_concurrency.lockutils [None req-245249e1-0137-4a64-b3d5-90551b6e0434 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Lock "707828f6-0267-42ff-95e5-6b328382b017" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 418.312s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1217.490610] env[67964]: DEBUG oslo_concurrency.lockutils [None req-245249e1-0137-4a64-b3d5-90551b6e0434 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Acquiring lock "707828f6-0267-42ff-95e5-6b328382b017-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1217.490845] env[67964]: DEBUG oslo_concurrency.lockutils [None req-245249e1-0137-4a64-b3d5-90551b6e0434 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Lock "707828f6-0267-42ff-95e5-6b328382b017-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1217.491029] env[67964]: DEBUG oslo_concurrency.lockutils [None req-245249e1-0137-4a64-b3d5-90551b6e0434 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Lock "707828f6-0267-42ff-95e5-6b328382b017-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.493174] env[67964]: INFO nova.compute.manager [None req-245249e1-0137-4a64-b3d5-90551b6e0434 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Terminating instance [ 1217.495017] env[67964]: DEBUG nova.compute.manager [None req-245249e1-0137-4a64-b3d5-90551b6e0434 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1217.495262] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-245249e1-0137-4a64-b3d5-90551b6e0434 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1217.495947] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-babab2b6-481c-4a36-a222-d57e5c4fbdbf {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1217.505029] env[67964]: DEBUG nova.compute.manager [None req-a9a03e68-2cc5-4370-b58f-05f3d7670428 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 57445f5b-8a3a-4d55-b926-ee2d3e24b6ce] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1217.510061] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1105fb3c-5257-4eb6-88d1-9f9febf0cf97 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1217.531031] env[67964]: DEBUG nova.compute.manager [None req-a9a03e68-2cc5-4370-b58f-05f3d7670428 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 57445f5b-8a3a-4d55-b926-ee2d3e24b6ce] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1217.541747] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-245249e1-0137-4a64-b3d5-90551b6e0434 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 707828f6-0267-42ff-95e5-6b328382b017 could not be found.
[ 1217.541858] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-245249e1-0137-4a64-b3d5-90551b6e0434 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1217.541977] env[67964]: INFO nova.compute.manager [None req-245249e1-0137-4a64-b3d5-90551b6e0434 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1217.542235] env[67964]: DEBUG oslo.service.loopingcall [None req-245249e1-0137-4a64-b3d5-90551b6e0434 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1217.542624] env[67964]: DEBUG nova.compute.manager [-] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1217.542725] env[67964]: DEBUG nova.network.neutron [-] [instance: 707828f6-0267-42ff-95e5-6b328382b017] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1217.558765] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a9a03e68-2cc5-4370-b58f-05f3d7670428 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "57445f5b-8a3a-4d55-b926-ee2d3e24b6ce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 220.955s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.566984] env[67964]: DEBUG nova.network.neutron [-] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1217.568312] env[67964]: DEBUG nova.compute.manager [None req-a0e5a5de-4a9a-4103-b8c2-447de0f46a3a tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 53718899-b65f-4e3b-a8d6-7277e946ab43] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1217.573961] env[67964]: INFO nova.compute.manager [-] [instance: 707828f6-0267-42ff-95e5-6b328382b017] Took 0.03 seconds to deallocate network for instance. [ 1217.590009] env[67964]: DEBUG nova.compute.manager [None req-a0e5a5de-4a9a-4103-b8c2-447de0f46a3a tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] [instance: 53718899-b65f-4e3b-a8d6-7277e946ab43] Instance disappeared before build.
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1217.610319] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a0e5a5de-4a9a-4103-b8c2-447de0f46a3a tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Lock "53718899-b65f-4e3b-a8d6-7277e946ab43" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 219.719s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.624232] env[67964]: DEBUG nova.compute.manager [None req-349d2e60-f63a-4842-8fb9-4995dfb65b9c tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 236faf76-d72e-4c2b-9b44-9d1866491310] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1217.647759] env[67964]: DEBUG nova.compute.manager [None req-349d2e60-f63a-4842-8fb9-4995dfb65b9c tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 236faf76-d72e-4c2b-9b44-9d1866491310] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1217.668711] env[67964]: DEBUG oslo_concurrency.lockutils [None req-245249e1-0137-4a64-b3d5-90551b6e0434 tempest-VolumesAdminNegativeTest-684699916 tempest-VolumesAdminNegativeTest-684699916-project-member] Lock "707828f6-0267-42ff-95e5-6b328382b017" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.179s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.669796] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "707828f6-0267-42ff-95e5-6b328382b017" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 57.952s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1217.670363] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 707828f6-0267-42ff-95e5-6b328382b017] During sync_power_state the instance has a pending task (deleting). Skip. [ 1217.670557] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "707828f6-0267-42ff-95e5-6b328382b017" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.676912] env[67964]: DEBUG oslo_concurrency.lockutils [None req-349d2e60-f63a-4842-8fb9-4995dfb65b9c tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "236faf76-d72e-4c2b-9b44-9d1866491310" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 218.205s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.689153] env[67964]: DEBUG nova.compute.manager [None req-fe2ae47e-5133-4c60-b3e2-6249def18f17 tempest-ServersTestMultiNic-682379730 tempest-ServersTestMultiNic-682379730-project-member] [instance: f36ba9db-c547-4d77-9e49-24bfcc995e89] Starting instance...
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1217.711849] env[67964]: DEBUG nova.compute.manager [None req-fe2ae47e-5133-4c60-b3e2-6249def18f17 tempest-ServersTestMultiNic-682379730 tempest-ServersTestMultiNic-682379730-project-member] [instance: f36ba9db-c547-4d77-9e49-24bfcc995e89] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1217.734143] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fe2ae47e-5133-4c60-b3e2-6249def18f17 tempest-ServersTestMultiNic-682379730 tempest-ServersTestMultiNic-682379730-project-member] Lock "f36ba9db-c547-4d77-9e49-24bfcc995e89" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 209.354s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.747329] env[67964]: DEBUG nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1217.797587] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1217.797843] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1217.799402] env[67964]: INFO nova.compute.claims [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1218.095371] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03ee4d03-b9bb-429c-9404-d70f1a08a089 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.103365] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c46060cf-b826-4fec-b870-e659b9b3efdf {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.132958] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8db05c16-7029-4b0f-8dd5-b695306884d7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.140104] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8695180e-727c-4e29-92d0-aceec5151e8f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.153693] env[67964]: DEBUG nova.compute.provider_tree [None
req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1218.162819] env[67964]: DEBUG nova.scheduler.client.report [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1218.176327] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.378s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1218.176923] env[67964]: DEBUG nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1218.212354] env[67964]: DEBUG nova.compute.utils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1218.213986] env[67964]: DEBUG nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1218.214172] env[67964]: DEBUG nova.network.neutron [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1218.223875] env[67964]: DEBUG nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Start building block device mappings for instance. 
{{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1218.279534] env[67964]: DEBUG nova.policy [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7373f7b862cc4f43a074101da040ac07', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '30050a5e509146ea87e6a86263ba0f59', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1218.306070] env[67964]: DEBUG nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Start spawning the instance on the hypervisor. {{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1218.330894] env[67964]: DEBUG nova.virt.hardware [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1218.331161] env[67964]: DEBUG nova.virt.hardware [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1218.331318] env[67964]: DEBUG nova.virt.hardware [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1218.331497] env[67964]: DEBUG nova.virt.hardware [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1218.331639] env[67964]: DEBUG nova.virt.hardware [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1218.331784] env[67964]: DEBUG nova.virt.hardware [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 
tempest-ImagesTestJSON-270378177-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1218.331987] env[67964]: DEBUG nova.virt.hardware [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1218.332160] env[67964]: DEBUG nova.virt.hardware [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1218.332340] env[67964]: DEBUG nova.virt.hardware [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1218.332546] env[67964]: DEBUG nova.virt.hardware [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1218.332731] env[67964]: DEBUG nova.virt.hardware [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1218.333579] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ace08a47-1daa-4b3d-a26b-894d7f72d2fb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.343794] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02a63255-b430-4d7d-b6f1-5914c6640a90 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.652064] env[67964]: DEBUG nova.network.neutron [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Successfully created port: 39503df5-bc9b-4267-b747-c13f9701fd73 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1219.390424] env[67964]: DEBUG nova.compute.manager [req-11798670-abf1-4227-b3f6-7a3b1e38cbad req-77791cae-73fd-443b-836a-7b765bad1f95 service nova] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Received event network-vif-plugged-39503df5-bc9b-4267-b747-c13f9701fd73 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1219.390644] env[67964]: DEBUG oslo_concurrency.lockutils [req-11798670-abf1-4227-b3f6-7a3b1e38cbad req-77791cae-73fd-443b-836a-7b765bad1f95 service nova] Acquiring lock "9cd7ef82-147a-4303-a773-32b161f819ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1219.390848] env[67964]: DEBUG oslo_concurrency.lockutils [req-11798670-abf1-4227-b3f6-7a3b1e38cbad req-77791cae-73fd-443b-836a-7b765bad1f95 service nova] Lock "9cd7ef82-147a-4303-a773-32b161f819ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1219.391018] env[67964]: DEBUG oslo_concurrency.lockutils [req-11798670-abf1-4227-b3f6-7a3b1e38cbad req-77791cae-73fd-443b-836a-7b765bad1f95 service nova] Lock "9cd7ef82-147a-4303-a773-32b161f819ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1219.391178] env[67964]: DEBUG nova.compute.manager [req-11798670-abf1-4227-b3f6-7a3b1e38cbad req-77791cae-73fd-443b-836a-7b765bad1f95 service nova] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] No waiting events found dispatching network-vif-plugged-39503df5-bc9b-4267-b747-c13f9701fd73 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1219.391333] env[67964]: WARNING nova.compute.manager [req-11798670-abf1-4227-b3f6-7a3b1e38cbad req-77791cae-73fd-443b-836a-7b765bad1f95 service nova] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Received unexpected event network-vif-plugged-39503df5-bc9b-4267-b747-c13f9701fd73 for instance with vm_state building and task_state spawning. [ 1219.411657] env[67964]: DEBUG nova.network.neutron [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Successfully updated port: 39503df5-bc9b-4267-b747-c13f9701fd73 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1219.421760] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "refresh_cache-9cd7ef82-147a-4303-a773-32b161f819ef" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1219.421955] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquired lock "refresh_cache-9cd7ef82-147a-4303-a773-32b161f819ef" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1219.422121] env[67964]: DEBUG nova.network.neutron [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1219.467955] env[67964]: DEBUG nova.network.neutron [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Instance cache missing network info.
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1219.677201] env[67964]: DEBUG nova.network.neutron [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Updating instance_info_cache with network_info: [{"id": "39503df5-bc9b-4267-b747-c13f9701fd73", "address": "fa:16:3e:29:6a:8f", "network": {"id": "4688491e-7bc1-42dc-b5f6-d988d578de92", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1770914470-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30050a5e509146ea87e6a86263ba0f59", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b107fab-ee71-47db-ad4d-3c6f05546843", "external-id": "cl2-zone-554", "segmentation_id": 554, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap39503df5-bc", "ovs_interfaceid": "39503df5-bc9b-4267-b747-c13f9701fd73", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1219.693330] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Releasing lock "refresh_cache-9cd7ef82-147a-4303-a773-32b161f819ef" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1219.693676] env[67964]: DEBUG nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Instance network_info: |[{"id": "39503df5-bc9b-4267-b747-c13f9701fd73", "address": "fa:16:3e:29:6a:8f", "network": {"id": "4688491e-7bc1-42dc-b5f6-d988d578de92", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1770914470-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30050a5e509146ea87e6a86263ba0f59", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b107fab-ee71-47db-ad4d-3c6f05546843", "external-id": "cl2-zone-554", "segmentation_id": 554, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap39503df5-bc", "ovs_interfaceid": "39503df5-bc9b-4267-b747-c13f9701fd73", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1219.694135] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None 
req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:29:6a:8f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3b107fab-ee71-47db-ad4d-3c6f05546843', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '39503df5-bc9b-4267-b747-c13f9701fd73', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1219.702188] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Creating folder: Project (30050a5e509146ea87e6a86263ba0f59). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1219.702772] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-41c2461f-6f17-40ce-9239-0e4b4af93531 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1219.713225] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Created folder: Project (30050a5e509146ea87e6a86263ba0f59) in parent group-v690366. [ 1219.713426] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Creating folder: Instances. Parent ref: group-v690439. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1219.713757] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-79812804-5f10-480f-8a69-92ce5f4418bf {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1219.722349] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Created folder: Instances in parent group-v690439. [ 1219.722586] env[67964]: DEBUG oslo.service.loopingcall [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1219.722769] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1219.722971] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3a9a1459-667f-49ad-bd27-03b2078ff5e3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1219.741570] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1219.741570] env[67964]: value = "task-3456801" [ 1219.741570] env[67964]: _type = "Task" [ 1219.741570] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1219.749308] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456801, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1220.251960] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456801, 'name': CreateVM_Task, 'duration_secs': 0.313368} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1220.252360] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1220.259136] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1220.259314] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1220.259626] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1220.259870] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-922c2d9f-2f0b-41e7-998e-b811154d3ac2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1220.264459] env[67964]: DEBUG oslo_vmware.api [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for the task: (returnval){ [ 1220.264459] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]529ce013-e1c7-a3a4-9f0b-a38c99560132" [ 1220.264459] env[67964]: _type = "Task" [ 1220.264459] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1220.272955] env[67964]: DEBUG oslo_vmware.api [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]529ce013-e1c7-a3a4-9f0b-a38c99560132, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1220.775241] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1220.775559] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1220.775672] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1221.417531] env[67964]: DEBUG nova.compute.manager [req-52964cd8-feda-4a97-9335-d59856139c97 req-3ee676ba-426c-456b-a162-b20d9c8e3f4a service nova] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Received event network-changed-39503df5-bc9b-4267-b747-c13f9701fd73 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1221.417660] env[67964]: DEBUG nova.compute.manager [req-52964cd8-feda-4a97-9335-d59856139c97 req-3ee676ba-426c-456b-a162-b20d9c8e3f4a service nova] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Refreshing instance network info cache due to event network-changed-39503df5-bc9b-4267-b747-c13f9701fd73. 
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1221.417844] env[67964]: DEBUG oslo_concurrency.lockutils [req-52964cd8-feda-4a97-9335-d59856139c97 req-3ee676ba-426c-456b-a162-b20d9c8e3f4a service nova] Acquiring lock "refresh_cache-9cd7ef82-147a-4303-a773-32b161f819ef" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1221.417972] env[67964]: DEBUG oslo_concurrency.lockutils [req-52964cd8-feda-4a97-9335-d59856139c97 req-3ee676ba-426c-456b-a162-b20d9c8e3f4a service nova] Acquired lock "refresh_cache-9cd7ef82-147a-4303-a773-32b161f819ef" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1221.418266] env[67964]: DEBUG nova.network.neutron [req-52964cd8-feda-4a97-9335-d59856139c97 req-3ee676ba-426c-456b-a162-b20d9c8e3f4a service nova] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Refreshing network info cache for port 39503df5-bc9b-4267-b747-c13f9701fd73 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1221.456493] env[67964]: DEBUG oslo_concurrency.lockutils [None req-358426c2-f070-4f75-ac39-36ca453f39a6 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "9cd7ef82-147a-4303-a773-32b161f819ef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1221.936410] env[67964]: DEBUG nova.network.neutron [req-52964cd8-feda-4a97-9335-d59856139c97 req-3ee676ba-426c-456b-a162-b20d9c8e3f4a service nova] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Updated VIF entry in instance network info cache for port 39503df5-bc9b-4267-b747-c13f9701fd73.
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1221.936771] env[67964]: DEBUG nova.network.neutron [req-52964cd8-feda-4a97-9335-d59856139c97 req-3ee676ba-426c-456b-a162-b20d9c8e3f4a service nova] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Updating instance_info_cache with network_info: [{"id": "39503df5-bc9b-4267-b747-c13f9701fd73", "address": "fa:16:3e:29:6a:8f", "network": {"id": "4688491e-7bc1-42dc-b5f6-d988d578de92", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1770914470-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30050a5e509146ea87e6a86263ba0f59", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b107fab-ee71-47db-ad4d-3c6f05546843", "external-id": "cl2-zone-554", "segmentation_id": 554, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap39503df5-bc", "ovs_interfaceid": "39503df5-bc9b-4267-b747-c13f9701fd73", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1221.945789] env[67964]: DEBUG oslo_concurrency.lockutils [req-52964cd8-feda-4a97-9335-d59856139c97 req-3ee676ba-426c-456b-a162-b20d9c8e3f4a service nova] Releasing lock "refresh_cache-9cd7ef82-147a-4303-a773-32b161f819ef" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1233.293522] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Acquiring lock "ec783231-6f62-4177-ba76-4ba688dda077" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1233.293817] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Lock "ec783231-6f62-4177-ba76-4ba688dda077" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1243.445027] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fbf6107f-c6e6-4dca-a4dd-58e7adfb9f53 tempest-ServerShowV247Test-811162969 tempest-ServerShowV247Test-811162969-project-member] Acquiring lock "0f126555-f26e-42da-a468-28a28887c901" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1243.445411] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fbf6107f-c6e6-4dca-a4dd-58e7adfb9f53 tempest-ServerShowV247Test-811162969 tempest-ServerShowV247Test-811162969-project-member]
Lock "0f126555-f26e-42da-a468-28a28887c901" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1243.902034] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9a8db661-4469-40ee-abe6-9f08886943f1 tempest-ServerShowV247Test-811162969 tempest-ServerShowV247Test-811162969-project-member] Acquiring lock "0228456f-0055-43b9-9a81-e0f031e2a549" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1243.902494] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9a8db661-4469-40ee-abe6-9f08886943f1 tempest-ServerShowV247Test-811162969 tempest-ServerShowV247Test-811162969-project-member] Lock "0228456f-0055-43b9-9a81-e0f031e2a549" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1248.112959] env[67964]: DEBUG oslo_concurrency.lockutils [None req-b9df27ba-4726-4edc-809c-64e894beadff tempest-ServerMetadataNegativeTestJSON-1124299271 tempest-ServerMetadataNegativeTestJSON-1124299271-project-member] Acquiring lock "a0908e14-521d-42c1-baaa-b5863b1f142d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.113350] env[67964]: DEBUG oslo_concurrency.lockutils [None req-b9df27ba-4726-4edc-809c-64e894beadff tempest-ServerMetadataNegativeTestJSON-1124299271 tempest-ServerMetadataNegativeTestJSON-1124299271-project-member] Lock "a0908e14-521d-42c1-baaa-b5863b1f142d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1254.135860] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2fdf58fe-e78e-4265-a4b0-303fc616d9d4 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "d64969c7-d467-4958-8b04-aa2d2920769a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1254.136289] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2fdf58fe-e78e-4265-a4b0-303fc616d9d4 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "d64969c7-d467-4958-8b04-aa2d2920769a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1261.800613] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1261.800894] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0,
skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1261.801143] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1261.812582] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1261.812798] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1261.812965] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1261.813135] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1261.814209] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a7b97e2-ba9c-4c37-ba11-87fe41312457 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1261.823216] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24a04d64-8de8-45b4-8ab1-99a8230c30ff {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1261.838164] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d3733a6-08ef-4a9a-9765-2b7646dc67bd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1261.845032] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad8b889f-0641-428a-82f6-277747ea7517 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1261.872614] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180789MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1261.872753] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1261.872936] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1261.948406] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0768fe80-7dd3-42ec-8e22-42a6aece5bef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.948565] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9e47d3ce-3897-458b-ac85-d98745e9aeb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.948691] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea492fb8-2352-436c-a7d5-f20423f4d353 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.948813] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c648c89a-ca70-4a15-9083-0cbe9e5bee23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.948929] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9793d383-9033-4f86-b7bb-6b2e43347cd6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.949061] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 5fbee4c3-bc7c-4582-b976-b0d619a69cdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.949182] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 67eb58c3-a895-4427-9197-3b0c731a123a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.949296] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.949409] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d9dcb5d4-e8a3-4d4d-af94-1bde87121c08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.949521] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9cd7ef82-147a-4303-a773-32b161f819ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.959953] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18c148fb-1cd4-4537-9b77-089e9b272f83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1261.969709] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18d6df82-a19a-499a-8874-171218569651 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1261.979842] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ee34b117-806d-4cc4-98b7-0f40f074cfab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1261.990035] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7825ba9e-8603-4211-b5fe-708276272464 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1261.999465] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance cc4cdc79-2620-42c6-bf3d-0b108a2cbfe0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1262.009173] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fb025130-d995-4615-8dee-59af1700877f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1262.019289] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 60cd7925-3124-449e-8d27-4faa7b27cb9c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1262.028563] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec783231-6f62-4177-ba76-4ba688dda077 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1262.038570] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0f126555-f26e-42da-a468-28a28887c901 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1262.048326] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0228456f-0055-43b9-9a81-e0f031e2a549 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1262.058502] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance a0908e14-521d-42c1-baaa-b5863b1f142d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1262.068158] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d64969c7-d467-4958-8b04-aa2d2920769a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1262.068385] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1262.068536] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1262.317336] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20c1a3df-4145-472b-9980-c61942c597ba {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.324930] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d34e3eaf-6e2e-464c-8fc9-4bfc6ba278cf {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.354383] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b728f19a-0053-4c3c-8066-26d1dfdab390 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.361360] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63f9f9d5-be17-41d7-80b9-39b14bae045e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.374369] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1262.382721] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1262.398205] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1262.398388] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.525s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1263.398536] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1263.398806] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1263.398953] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1264.455193] env[67964]: WARNING oslo_vmware.rw_handles [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1264.455193] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1264.455193] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1264.455193] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1264.455193] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1264.455193] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1264.455193] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1264.455193] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1264.455193] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1264.455193] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1264.455193] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1264.455193] env[67964]: ERROR oslo_vmware.rw_handles [ 1264.455685] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/3b8bc28c-34e7-495d-b380-60dda9f910fc/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1264.458207] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 
tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1264.458478] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Copying Virtual Disk [datastore1] vmware_temp/3b8bc28c-34e7-495d-b380-60dda9f910fc/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/3b8bc28c-34e7-495d-b380-60dda9f910fc/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1264.458770] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-be50999f-769b-425a-8021-9184260c2f91 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.466192] env[67964]: DEBUG oslo_vmware.api [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Waiting for the task: (returnval){ [ 1264.466192] env[67964]: value = "task-3456802" [ 1264.466192] env[67964]: _type = "Task" [ 1264.466192] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1264.474072] env[67964]: DEBUG oslo_vmware.api [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Task: {'id': task-3456802, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1264.978048] env[67964]: DEBUG oslo_vmware.exceptions [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Fault InvalidArgument not matched. 
{{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1264.978048] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1264.978659] env[67964]: ERROR nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1264.978659] env[67964]: Faults: ['InvalidArgument'] [ 1264.978659] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Traceback (most recent call last): [ 1264.978659] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1264.978659] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] yield resources [ 1264.978659] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1264.978659] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] self.driver.spawn(context, instance, image_meta, [ 1264.978659] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1264.978659] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1264.978659] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1264.978659] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] self._fetch_image_if_missing(context, vi) [ 1264.978659] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1264.978969] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] image_cache(vi, tmp_image_ds_loc) [ 1264.978969] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1264.978969] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] vm_util.copy_virtual_disk( [ 1264.978969] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1264.978969] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] session._wait_for_task(vmdk_copy_task) [ 1264.978969] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1264.978969] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] return self.wait_for_task(task_ref) [ 1264.978969] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1264.978969] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] return evt.wait() [ 1264.978969] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1264.978969] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] result = hub.switch() [ 1264.978969] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1264.978969] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] return self.greenlet.switch() [ 1264.979517] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1264.979517] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] self.f(*self.args, **self.kw) [ 1264.979517] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1264.979517] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] raise exceptions.translate_fault(task_info.error) [ 1264.979517] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1264.979517] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Faults: ['InvalidArgument'] [ 1264.979517] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] [ 1264.979517] env[67964]: INFO nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Terminating instance [ 1264.980508] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1264.980727] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1264.981391] env[67964]: DEBUG nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 
tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1264.981574] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1264.981789] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bab540c3-10b9-4d3c-90e6-5b2001b80e15 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.984059] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b91afe6d-3736-4d48-a616-f614c83e8403 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.990606] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1264.990810] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0601f750-93dc-4e5d-b0a8-b77629f12f91 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.992909] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1264.993093] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1264.993980] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-786a8c86-f8af-43c4-a3e1-447b57c30eff {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.998464] env[67964]: DEBUG oslo_vmware.api [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Waiting for the task: (returnval){ [ 1264.998464] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]524d60c4-8746-f90a-0e71-86170170ee09" [ 1264.998464] env[67964]: _type = "Task" [ 1264.998464] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1265.005461] env[67964]: DEBUG oslo_vmware.api [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]524d60c4-8746-f90a-0e71-86170170ee09, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1265.056833] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1265.057062] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1265.057231] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Deleting the datastore file [datastore1] 0768fe80-7dd3-42ec-8e22-42a6aece5bef {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1265.057491] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-52e16997-81d5-4726-8404-ddb77aec1f2c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.063740] env[67964]: DEBUG oslo_vmware.api [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Waiting for the task: (returnval){ [ 1265.063740] env[67964]: value = "task-3456804" [ 1265.063740] env[67964]: _type = "Task" [ 1265.063740] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1265.070894] env[67964]: DEBUG oslo_vmware.api [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Task: {'id': task-3456804, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1265.508532] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1265.508785] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Creating directory with path [datastore1] vmware_temp/56366e84-f9d5-492b-a670-9146ef8155b1/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1265.509115] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f9388ccd-8bb9-4c8a-b95a-51a1c8beeb59 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.520220] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Created directory with path [datastore1] vmware_temp/56366e84-f9d5-492b-a670-9146ef8155b1/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1265.520409] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Fetch image to [datastore1] vmware_temp/56366e84-f9d5-492b-a670-9146ef8155b1/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1265.520573] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/56366e84-f9d5-492b-a670-9146ef8155b1/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1265.521318] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9fbd19c-edf0-4f5f-8740-7a9305142843 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.527658] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f921f8a-bcc8-41f1-804e-638d1b821284 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.536219] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1633c4b-d657-4e79-9862-9b4faeccdb7e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.568763] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-0c298c67-0488-4b2b-a24a-48c2f643ccde {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.575358] env[67964]: DEBUG oslo_vmware.api [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Task: {'id': task-3456804, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065903} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1265.576736] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1265.576930] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1265.577114] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1265.577287] env[67964]: INFO nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Took 0.60 seconds to destroy the instance on the hypervisor. 
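The task-3456804 exchange above is oslo.vmware's standard task-polling pattern: a vCenter task is created (here FileManager.DeleteDatastoreFile_Task), then wait_for_task polls it on a looping call, logging "progress is N%" until the task reaches a terminal state and reporting duration_secs on success. A minimal, self-contained sketch of that loop follows; it is illustrative only, with FakeTask standing in for the real vCenter task object (the actual implementation lives in oslo_vmware/api.py as wait_for_task/_poll_task):

    # Sketch of the poll-until-terminal-state loop behind the
    # "Task: {...} progress is N%" / "completed successfully" entries above.
    # FakeTask is a stand-in for a vCenter task; names here are illustrative.
    import time

    class FakeTask:
        """Stand-in for a vCenter task that finishes after a few polls."""
        def __init__(self, task_id, name):
            self.id = task_id
            self.name = name
            self._polls = 0

        def info(self):
            self._polls += 1
            if self._polls < 3:
                return {"state": "running", "progress": 33 * self._polls}
            return {"state": "success", "duration_secs": 0.065903}

    def wait_for_task(task, interval=0.5):
        """Poll until the task reaches a terminal state, logging progress."""
        while True:
            info = task.info()
            if info["state"] == "running":
                print(f"Task: {{'id': {task.id}, 'name': {task.name}}} "
                      f"progress is {info['progress']}%.")
                time.sleep(interval)
            elif info["state"] == "success":
                print(f"Task: {{'id': {task.id}, ...}} completed successfully.")
                return info
            else:
                raise RuntimeError(f"task {task.id} failed: {info}")

    wait_for_task(FakeTask("task-3456804", "DeleteDatastoreFile_Task"),
                  interval=0.01)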
[ 1265.579078] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2d494285-c082-44b5-ae70-d96aec2a868e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.581834] env[67964]: DEBUG nova.compute.claims [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1265.582010] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1265.582227] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1265.599315] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1265.758023] env[67964]: DEBUG oslo_vmware.rw_handles [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/56366e84-f9d5-492b-a670-9146ef8155b1/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1265.812579] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1265.817338] env[67964]: DEBUG oslo_vmware.rw_handles [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Completed reading data from the image iterator. 
{{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1265.817517] env[67964]: DEBUG oslo_vmware.rw_handles [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/56366e84-f9d5-492b-a670-9146ef8155b1/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1265.949643] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-801a8cad-5370-4dab-8b06-e216fae27c11 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.956851] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b20d86e-bf46-41a4-93ff-b42d33fe78da {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.985859] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c7c5f8a-dd2b-4a61-8a21-c5cecefc0503 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.993449] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-307d68c0-e1c6-4e31-a7af-cc9f5bf1fe00 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.006299] env[67964]: DEBUG nova.compute.provider_tree [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1266.015579] env[67964]: DEBUG nova.scheduler.client.report [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1266.029481] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.447s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1266.029898] env[67964]: ERROR nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 
0768fe80-7dd3-42ec-8e22-42a6aece5bef] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1266.029898] env[67964]: Faults: ['InvalidArgument'] [ 1266.029898] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Traceback (most recent call last): [ 1266.029898] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1266.029898] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] self.driver.spawn(context, instance, image_meta, [ 1266.029898] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1266.029898] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1266.029898] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1266.029898] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] self._fetch_image_if_missing(context, vi) [ 1266.029898] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1266.029898] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] image_cache(vi, tmp_image_ds_loc) [ 1266.029898] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1266.030200] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] vm_util.copy_virtual_disk( [ 1266.030200] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1266.030200] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] session._wait_for_task(vmdk_copy_task) [ 1266.030200] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1266.030200] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] return self.wait_for_task(task_ref) [ 1266.030200] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1266.030200] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] return evt.wait() [ 1266.030200] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1266.030200] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] result = hub.switch() [ 1266.030200] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1266.030200] env[67964]: ERROR 
nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] return self.greenlet.switch() [ 1266.030200] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1266.030200] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] self.f(*self.args, **self.kw) [ 1266.030457] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1266.030457] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] raise exceptions.translate_fault(task_info.error) [ 1266.030457] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1266.030457] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Faults: ['InvalidArgument'] [ 1266.030457] env[67964]: ERROR nova.compute.manager [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] [ 1266.030750] env[67964]: DEBUG nova.compute.utils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1266.031967] env[67964]: DEBUG nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Build of instance 0768fe80-7dd3-42ec-8e22-42a6aece5bef was re-scheduled: A specified parameter was not correct: fileType [ 1266.031967] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1266.032363] env[67964]: DEBUG nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1266.032533] env[67964]: DEBUG nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1266.032699] env[67964]: DEBUG nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1266.032856] env[67964]: DEBUG nova.network.neutron [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1266.356707] env[67964]: DEBUG nova.network.neutron [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1266.369146] env[67964]: INFO nova.compute.manager [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Took 0.34 seconds to deallocate network for instance. [ 1266.472936] env[67964]: INFO nova.scheduler.client.report [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Deleted allocations for instance 0768fe80-7dd3-42ec-8e22-42a6aece5bef [ 1266.495639] env[67964]: DEBUG oslo_concurrency.lockutils [None req-91edc0b0-4075-41db-9bb9-f15a72f186b7 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 657.886s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1266.496777] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c3d88bb4-3d9b-442a-97a3-fb5187204407 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 453.697s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1266.497029] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c3d88bb4-3d9b-442a-97a3-fb5187204407 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Acquiring lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1266.497231] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c3d88bb4-3d9b-442a-97a3-fb5187204407 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1266.497394] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c3d88bb4-3d9b-442a-97a3-fb5187204407 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1266.499645] env[67964]: INFO nova.compute.manager [None req-c3d88bb4-3d9b-442a-97a3-fb5187204407 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Terminating instance [ 1266.501968] env[67964]: DEBUG nova.compute.manager [None req-c3d88bb4-3d9b-442a-97a3-fb5187204407 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1266.501968] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c3d88bb4-3d9b-442a-97a3-fb5187204407 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1266.502852] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f4bdb6a8-8551-435d-b0bf-bf3582d35f9f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.514673] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9303b0a1-23f0-42a9-a774-0a9097411925 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.527595] env[67964]: DEBUG nova.compute.manager [None req-58718077-6acc-4c85-9126-d2d0e0a2a01e tempest-ServerAddressesNegativeTestJSON-188398278 tempest-ServerAddressesNegativeTestJSON-188398278-project-member] [instance: fbf2ae36-60a6-48e2-b115-22b13b5c4cc2] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1266.549384] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-c3d88bb4-3d9b-442a-97a3-fb5187204407 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0768fe80-7dd3-42ec-8e22-42a6aece5bef could not be found. 
[ 1266.549620] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c3d88bb4-3d9b-442a-97a3-fb5187204407 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1266.550570] env[67964]: INFO nova.compute.manager [None req-c3d88bb4-3d9b-442a-97a3-fb5187204407 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1266.550570] env[67964]: DEBUG oslo.service.loopingcall [None req-c3d88bb4-3d9b-442a-97a3-fb5187204407 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1266.550570] env[67964]: DEBUG nova.compute.manager [-] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1266.550570] env[67964]: DEBUG nova.network.neutron [-] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1266.552824] env[67964]: DEBUG nova.compute.manager [None req-58718077-6acc-4c85-9126-d2d0e0a2a01e tempest-ServerAddressesNegativeTestJSON-188398278 tempest-ServerAddressesNegativeTestJSON-188398278-project-member] [instance: fbf2ae36-60a6-48e2-b115-22b13b5c4cc2] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1266.574373] env[67964]: DEBUG oslo_concurrency.lockutils [None req-58718077-6acc-4c85-9126-d2d0e0a2a01e tempest-ServerAddressesNegativeTestJSON-188398278 tempest-ServerAddressesNegativeTestJSON-188398278-project-member] Lock "fbf2ae36-60a6-48e2-b115-22b13b5c4cc2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.406s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1266.579513] env[67964]: DEBUG nova.network.neutron [-] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1266.585723] env[67964]: DEBUG nova.compute.manager [None req-2a223a76-9979-4324-acf0-9a83c140325b tempest-ServersTestManualDisk-2066344586 tempest-ServersTestManualDisk-2066344586-project-member] [instance: 02b1d6da-0aa2-4199-a86a-fa5b197b2813] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1266.591020] env[67964]: INFO nova.compute.manager [-] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] Took 0.04 seconds to deallocate network for instance. [ 1266.609713] env[67964]: DEBUG nova.compute.manager [None req-2a223a76-9979-4324-acf0-9a83c140325b tempest-ServersTestManualDisk-2066344586 tempest-ServersTestManualDisk-2066344586-project-member] [instance: 02b1d6da-0aa2-4199-a86a-fa5b197b2813] Instance disappeared before build. 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1266.633172] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2a223a76-9979-4324-acf0-9a83c140325b tempest-ServersTestManualDisk-2066344586 tempest-ServersTestManualDisk-2066344586-project-member] Lock "02b1d6da-0aa2-4199-a86a-fa5b197b2813" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.085s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1266.642022] env[67964]: DEBUG nova.compute.manager [None req-d8e3d421-ba5f-4b4b-8141-6a42e305c2d2 tempest-InstanceActionsTestJSON-1556599388 tempest-InstanceActionsTestJSON-1556599388-project-member] [instance: b2cc5ba7-c5d1-4ecf-ba3a-fee3facbd159] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1266.663543] env[67964]: DEBUG nova.compute.manager [None req-d8e3d421-ba5f-4b4b-8141-6a42e305c2d2 tempest-InstanceActionsTestJSON-1556599388 tempest-InstanceActionsTestJSON-1556599388-project-member] [instance: b2cc5ba7-c5d1-4ecf-ba3a-fee3facbd159] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1266.688711] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d8e3d421-ba5f-4b4b-8141-6a42e305c2d2 tempest-InstanceActionsTestJSON-1556599388 tempest-InstanceActionsTestJSON-1556599388-project-member] Lock "b2cc5ba7-c5d1-4ecf-ba3a-fee3facbd159" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 219.926s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1266.690006] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c3d88bb4-3d9b-442a-97a3-fb5187204407 tempest-ListServerFiltersTestJSON-1359406749 tempest-ListServerFiltersTestJSON-1359406749-project-member] Lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.193s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1266.690757] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 106.973s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1266.690944] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 0768fe80-7dd3-42ec-8e22-42a6aece5bef] During sync_power_state the instance has a pending task (deleting). Skip. [ 1266.691125] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "0768fe80-7dd3-42ec-8e22-42a6aece5bef" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1266.698420] env[67964]: DEBUG nova.compute.manager [None req-0f1243af-7816-42a7-a312-8fc9b42208f3 tempest-ServerShowV257Test-1393857793 tempest-ServerShowV257Test-1393857793-project-member] [instance: 4cdc869e-2b97-4107-ae4d-49f99131048a] Starting instance... 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1266.721457] env[67964]: DEBUG nova.compute.manager [None req-0f1243af-7816-42a7-a312-8fc9b42208f3 tempest-ServerShowV257Test-1393857793 tempest-ServerShowV257Test-1393857793-project-member] [instance: 4cdc869e-2b97-4107-ae4d-49f99131048a] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1266.742430] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0f1243af-7816-42a7-a312-8fc9b42208f3 tempest-ServerShowV257Test-1393857793 tempest-ServerShowV257Test-1393857793-project-member] Lock "4cdc869e-2b97-4107-ae4d-49f99131048a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.231s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1266.753796] env[67964]: DEBUG nova.compute.manager [None req-a275a597-95db-4676-8dbb-b8a4cecea5c7 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 7fe6f046-65c9-4464-931c-07e781c497aa] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1266.777202] env[67964]: DEBUG nova.compute.manager [None req-a275a597-95db-4676-8dbb-b8a4cecea5c7 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 7fe6f046-65c9-4464-931c-07e781c497aa] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1266.797792] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a275a597-95db-4676-8dbb-b8a4cecea5c7 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "7fe6f046-65c9-4464-931c-07e781c497aa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 209.510s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1266.798164] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1266.799174] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1266.806729] env[67964]: DEBUG nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Starting instance... 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1266.852418] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1266.852666] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1266.854135] env[67964]: INFO nova.compute.claims [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1267.145528] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-903f9c53-228f-47e6-8138-305e16aa14b8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.153075] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9666c6c0-6b37-42cc-9756-9288f94272b8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.181476] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-284d32f2-00d0-4119-a50f-d1e856041152 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.188198] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5daea31a-4210-49e2-a2dc-f675abdd2022 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.200774] env[67964]: DEBUG nova.compute.provider_tree [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1267.211126] env[67964]: DEBUG nova.scheduler.client.report [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 
1267.224242] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.371s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.224705] env[67964]: DEBUG nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1267.255850] env[67964]: DEBUG nova.compute.utils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1267.257014] env[67964]: DEBUG nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1267.257201] env[67964]: DEBUG nova.network.neutron [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1267.265143] env[67964]: DEBUG nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1267.319296] env[67964]: DEBUG nova.policy [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '78a904359f8042a687e6653a27f2d6fe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c347fb9ac8c94bdbb084c8fab4bc3fcc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1267.323506] env[67964]: DEBUG nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1267.350775] env[67964]: DEBUG nova.virt.hardware [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1267.352039] env[67964]: DEBUG nova.virt.hardware [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1267.352039] env[67964]: DEBUG nova.virt.hardware [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1267.352039] env[67964]: DEBUG nova.virt.hardware [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1267.352039] env[67964]: DEBUG nova.virt.hardware [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1267.352039] env[67964]: DEBUG nova.virt.hardware [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1267.352231] env[67964]: DEBUG nova.virt.hardware [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1267.352231] env[67964]: DEBUG nova.virt.hardware [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1267.352231] env[67964]: DEBUG nova.virt.hardware [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1267.352310] env[67964]: DEBUG nova.virt.hardware [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1267.352455] env[67964]: DEBUG nova.virt.hardware [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1267.353320] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f14ffb2-75e2-48fb-a5d0-d02151562142 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.361754] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa4709ae-1ec9-40a1-b877-2f8008d6930c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.641192] env[67964]: DEBUG nova.network.neutron [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Successfully created port: 8dba205a-5d0b-4f9f-9f78-3d44ccc4be79 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1268.326052] env[67964]: DEBUG nova.network.neutron [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Successfully updated port: 8dba205a-5d0b-4f9f-9f78-3d44ccc4be79 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1268.337616] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Acquiring lock "refresh_cache-18c148fb-1cd4-4537-9b77-089e9b272f83" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1268.339761] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Acquired lock "refresh_cache-18c148fb-1cd4-4537-9b77-089e9b272f83" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1268.339761] env[67964]: DEBUG nova.network.neutron [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Building network info 
cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1268.378702] env[67964]: DEBUG nova.network.neutron [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1268.473872] env[67964]: DEBUG nova.compute.manager [req-59dadc0b-1f4d-4372-befb-1da3220001e0 req-84249e02-ef40-4dc3-ab28-380e281f79ca service nova] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Received event network-vif-plugged-8dba205a-5d0b-4f9f-9f78-3d44ccc4be79 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1268.474102] env[67964]: DEBUG oslo_concurrency.lockutils [req-59dadc0b-1f4d-4372-befb-1da3220001e0 req-84249e02-ef40-4dc3-ab28-380e281f79ca service nova] Acquiring lock "18c148fb-1cd4-4537-9b77-089e9b272f83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1268.474304] env[67964]: DEBUG oslo_concurrency.lockutils [req-59dadc0b-1f4d-4372-befb-1da3220001e0 req-84249e02-ef40-4dc3-ab28-380e281f79ca service nova] Lock "18c148fb-1cd4-4537-9b77-089e9b272f83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1268.474475] env[67964]: DEBUG oslo_concurrency.lockutils [req-59dadc0b-1f4d-4372-befb-1da3220001e0 req-84249e02-ef40-4dc3-ab28-380e281f79ca service nova] Lock "18c148fb-1cd4-4537-9b77-089e9b272f83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1268.474702] env[67964]: DEBUG nova.compute.manager [req-59dadc0b-1f4d-4372-befb-1da3220001e0 req-84249e02-ef40-4dc3-ab28-380e281f79ca service nova] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] No waiting events found dispatching network-vif-plugged-8dba205a-5d0b-4f9f-9f78-3d44ccc4be79 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1268.474882] env[67964]: WARNING nova.compute.manager [req-59dadc0b-1f4d-4372-befb-1da3220001e0 req-84249e02-ef40-4dc3-ab28-380e281f79ca service nova] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Received unexpected event network-vif-plugged-8dba205a-5d0b-4f9f-9f78-3d44ccc4be79 for instance with vm_state building and task_state spawning. [ 1268.475050] env[67964]: DEBUG nova.compute.manager [req-59dadc0b-1f4d-4372-befb-1da3220001e0 req-84249e02-ef40-4dc3-ab28-380e281f79ca service nova] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Received event network-changed-8dba205a-5d0b-4f9f-9f78-3d44ccc4be79 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1268.475207] env[67964]: DEBUG nova.compute.manager [req-59dadc0b-1f4d-4372-befb-1da3220001e0 req-84249e02-ef40-4dc3-ab28-380e281f79ca service nova] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Refreshing instance network info cache due to event network-changed-8dba205a-5d0b-4f9f-9f78-3d44ccc4be79. 
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1268.475368] env[67964]: DEBUG oslo_concurrency.lockutils [req-59dadc0b-1f4d-4372-befb-1da3220001e0 req-84249e02-ef40-4dc3-ab28-380e281f79ca service nova] Acquiring lock "refresh_cache-18c148fb-1cd4-4537-9b77-089e9b272f83" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1268.540506] env[67964]: DEBUG nova.network.neutron [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Updating instance_info_cache with network_info: [{"id": "8dba205a-5d0b-4f9f-9f78-3d44ccc4be79", "address": "fa:16:3e:93:64:68", "network": {"id": "15c31caa-5025-443a-9a23-ad23f19b843a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1866596066-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c347fb9ac8c94bdbb084c8fab4bc3fcc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f54f7284-8f7d-47ee-839d-2143062cfe44", "external-id": "nsx-vlan-transportzone-569", "segmentation_id": 569, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8dba205a-5d", "ovs_interfaceid": "8dba205a-5d0b-4f9f-9f78-3d44ccc4be79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1268.551462] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Releasing lock "refresh_cache-18c148fb-1cd4-4537-9b77-089e9b272f83" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1268.551742] env[67964]: DEBUG nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Instance network_info: |[{"id": "8dba205a-5d0b-4f9f-9f78-3d44ccc4be79", "address": "fa:16:3e:93:64:68", "network": {"id": "15c31caa-5025-443a-9a23-ad23f19b843a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1866596066-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c347fb9ac8c94bdbb084c8fab4bc3fcc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f54f7284-8f7d-47ee-839d-2143062cfe44", "external-id": "nsx-vlan-transportzone-569", "segmentation_id": 569, "bound_drivers": {"0": 
"nsxv3"}}, "devname": "tap8dba205a-5d", "ovs_interfaceid": "8dba205a-5d0b-4f9f-9f78-3d44ccc4be79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1268.552038] env[67964]: DEBUG oslo_concurrency.lockutils [req-59dadc0b-1f4d-4372-befb-1da3220001e0 req-84249e02-ef40-4dc3-ab28-380e281f79ca service nova] Acquired lock "refresh_cache-18c148fb-1cd4-4537-9b77-089e9b272f83" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1268.552220] env[67964]: DEBUG nova.network.neutron [req-59dadc0b-1f4d-4372-befb-1da3220001e0 req-84249e02-ef40-4dc3-ab28-380e281f79ca service nova] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Refreshing network info cache for port 8dba205a-5d0b-4f9f-9f78-3d44ccc4be79 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1268.553169] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:93:64:68', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f54f7284-8f7d-47ee-839d-2143062cfe44', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8dba205a-5d0b-4f9f-9f78-3d44ccc4be79', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1268.560688] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Creating folder: Project (c347fb9ac8c94bdbb084c8fab4bc3fcc). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1268.561531] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-793f21bd-854f-4be5-8dca-e6b0bf65f299 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.575654] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Created folder: Project (c347fb9ac8c94bdbb084c8fab4bc3fcc) in parent group-v690366. [ 1268.575833] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Creating folder: Instances. Parent ref: group-v690442. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1268.576069] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-578641dc-475a-4ad8-9578-329a652815d9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.585506] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Created folder: Instances in parent group-v690442. 
[ 1268.585720] env[67964]: DEBUG oslo.service.loopingcall [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1268.585892] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1268.586250] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a9acc64c-9376-4948-95fa-af5cce0781e6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.605803] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1268.605803] env[67964]: value = "task-3456807" [ 1268.605803] env[67964]: _type = "Task" [ 1268.605803] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1268.614488] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456807, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1268.800847] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1268.801173] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1268.801173] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1268.825346] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1268.825520] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1268.825653] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1268.825780] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1268.825901] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1268.826032] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1268.826157] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1268.826273] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1268.826389] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1268.826502] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1268.826681] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1268.830073] env[67964]: DEBUG nova.network.neutron [req-59dadc0b-1f4d-4372-befb-1da3220001e0 req-84249e02-ef40-4dc3-ab28-380e281f79ca service nova] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Updated VIF entry in instance network info cache for port 8dba205a-5d0b-4f9f-9f78-3d44ccc4be79. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1268.830399] env[67964]: DEBUG nova.network.neutron [req-59dadc0b-1f4d-4372-befb-1da3220001e0 req-84249e02-ef40-4dc3-ab28-380e281f79ca service nova] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Updating instance_info_cache with network_info: [{"id": "8dba205a-5d0b-4f9f-9f78-3d44ccc4be79", "address": "fa:16:3e:93:64:68", "network": {"id": "15c31caa-5025-443a-9a23-ad23f19b843a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1866596066-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c347fb9ac8c94bdbb084c8fab4bc3fcc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f54f7284-8f7d-47ee-839d-2143062cfe44", "external-id": "nsx-vlan-transportzone-569", "segmentation_id": 569, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8dba205a-5d", "ovs_interfaceid": "8dba205a-5d0b-4f9f-9f78-3d44ccc4be79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1268.839633] env[67964]: DEBUG oslo_concurrency.lockutils [req-59dadc0b-1f4d-4372-befb-1da3220001e0 req-84249e02-ef40-4dc3-ab28-380e281f79ca service nova] Releasing lock "refresh_cache-18c148fb-1cd4-4537-9b77-089e9b272f83" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1269.115485] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456807, 'name': CreateVM_Task, 'duration_secs': 0.296646} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1269.115655] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1269.116352] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1269.116512] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1269.116883] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1269.117148] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8af560dc-7b72-49df-9117-dc773763532d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1269.121437] env[67964]: DEBUG oslo_vmware.api [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Waiting for the task: (returnval){ [ 1269.121437] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52505117-e2f9-2496-8dbe-0ca8cdafa81b" [ 1269.121437] env[67964]: _type = "Task" [ 1269.121437] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1269.128697] env[67964]: DEBUG oslo_vmware.api [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52505117-e2f9-2496-8dbe-0ca8cdafa81b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1269.632116] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1269.632387] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1269.632597] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1274.823587] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1314.767897] env[67964]: WARNING oslo_vmware.rw_handles [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1314.767897] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1314.767897] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1314.767897] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1314.767897] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1314.767897] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1314.767897] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1314.767897] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1314.767897] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1314.767897] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1314.767897] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1314.767897] env[67964]: ERROR oslo_vmware.rw_handles [ 1314.768798] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to 
vmware_temp/56366e84-f9d5-492b-a670-9146ef8155b1/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1314.770726] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1314.771032] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Copying Virtual Disk [datastore1] vmware_temp/56366e84-f9d5-492b-a670-9146ef8155b1/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/56366e84-f9d5-492b-a670-9146ef8155b1/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1314.771262] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6afbca0f-4afa-4df4-b2ed-5b1314044786 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1314.779873] env[67964]: DEBUG oslo_vmware.api [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Waiting for the task: (returnval){ [ 1314.779873] env[67964]: value = "task-3456808" [ 1314.779873] env[67964]: _type = "Task" [ 1314.779873] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1314.789322] env[67964]: DEBUG oslo_vmware.api [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Task: {'id': task-3456808, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1315.290474] env[67964]: DEBUG oslo_vmware.exceptions [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Fault InvalidArgument not matched. 
{{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1315.292036] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1315.292036] env[67964]: ERROR nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1315.292036] env[67964]: Faults: ['InvalidArgument'] [ 1315.292036] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Traceback (most recent call last): [ 1315.292036] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1315.292036] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] yield resources [ 1315.292036] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1315.292036] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] self.driver.spawn(context, instance, image_meta, [ 1315.292036] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1315.292036] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1315.292391] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1315.292391] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] self._fetch_image_if_missing(context, vi) [ 1315.292391] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1315.292391] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] image_cache(vi, tmp_image_ds_loc) [ 1315.292391] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1315.292391] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] vm_util.copy_virtual_disk( [ 1315.292391] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1315.292391] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] session._wait_for_task(vmdk_copy_task) [ 1315.292391] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1315.292391] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] return self.wait_for_task(task_ref) [ 1315.292391] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1315.292391] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] return evt.wait() [ 1315.292391] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1315.292726] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] result = hub.switch() [ 1315.292726] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1315.292726] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] return self.greenlet.switch() [ 1315.292726] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1315.292726] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] self.f(*self.args, **self.kw) [ 1315.292726] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1315.292726] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] raise exceptions.translate_fault(task_info.error) [ 1315.292726] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1315.292726] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Faults: ['InvalidArgument'] [ 1315.292726] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] [ 1315.292726] env[67964]: INFO nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Terminating instance [ 1315.293415] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1315.293617] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1315.293852] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-23433512-4372-4422-936d-1d97c01bea41 {{(pid=67964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.296022] env[67964]: DEBUG nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1315.296218] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1315.296915] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d24817a-3846-47b3-af9c-86bdc64346bb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.304209] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1315.305119] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6f6bed49-bde6-435e-b1b5-9addedc9995c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.306442] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1315.306613] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1315.307296] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6653d92c-5f7f-4ff9-85db-be2b3e7f2cb7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.312685] env[67964]: DEBUG oslo_vmware.api [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Waiting for the task: (returnval){ [ 1315.312685] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5205613a-9847-a57f-63c4-4c64738c3ad2" [ 1315.312685] env[67964]: _type = "Task" [ 1315.312685] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1315.321425] env[67964]: DEBUG oslo_vmware.api [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5205613a-9847-a57f-63c4-4c64738c3ad2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1315.380174] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1315.380396] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1315.380576] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Deleting the datastore file [datastore1] 9e47d3ce-3897-458b-ac85-d98745e9aeb5 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1315.380841] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-430446a7-c266-4f74-9bd0-bc8a538e0d6a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.387428] env[67964]: DEBUG oslo_vmware.api [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Waiting for the task: (returnval){ [ 1315.387428] env[67964]: value = "task-3456810" [ 1315.387428] env[67964]: _type = "Task" [ 1315.387428] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1315.395320] env[67964]: DEBUG oslo_vmware.api [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Task: {'id': task-3456810, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1315.823211] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1315.823520] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Creating directory with path [datastore1] vmware_temp/5ddff9b4-2452-4701-a805-e4b83feb77fa/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1315.823700] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b84d85a3-526f-4561-bae4-e333b8ef4cf1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.834859] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Created directory with path [datastore1] vmware_temp/5ddff9b4-2452-4701-a805-e4b83feb77fa/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1315.835076] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Fetch image to [datastore1] vmware_temp/5ddff9b4-2452-4701-a805-e4b83feb77fa/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1315.835284] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/5ddff9b4-2452-4701-a805-e4b83feb77fa/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1315.836009] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdfcc5b4-a8f7-4e02-a025-d8e7710e83b6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.842415] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-606d4afb-f263-4ae8-a604-06eb551d5e62 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.851180] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96b8df02-5331-4ad8-bf3c-aee508a8e587 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.881914] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-830eefd7-a64d-4c10-921a-14f6a90a40d2 {{(pid=67964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.887385] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8300add3-04bd-4fbd-9604-cb5c17638a29 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.896443] env[67964]: DEBUG oslo_vmware.api [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Task: {'id': task-3456810, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.061172} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1315.896667] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1315.896841] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1315.897014] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1315.897191] env[67964]: INFO nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1315.899286] env[67964]: DEBUG nova.compute.claims [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1315.899453] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1315.899663] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1315.912883] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1315.966676] env[67964]: DEBUG oslo_vmware.rw_handles [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5ddff9b4-2452-4701-a805-e4b83feb77fa/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1316.026479] env[67964]: DEBUG oslo_vmware.rw_handles [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1316.026717] env[67964]: DEBUG oslo_vmware.rw_handles [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5ddff9b4-2452-4701-a805-e4b83feb77fa/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1316.245180] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca7e332e-17cb-42bf-84e1-29b730564081 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.253271] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41e938fa-3026-455f-a3f2-75770f2f855b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.282348] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cc2f0e9-0030-435b-95ee-a2f93cfc169f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.289551] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4619aacb-3945-4042-af40-14c309e4c955 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.303551] env[67964]: DEBUG nova.compute.provider_tree [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1316.313899] env[67964]: DEBUG nova.scheduler.client.report [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1316.333102] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.432s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1316.333102] env[67964]: ERROR nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1316.333102] env[67964]: Faults: ['InvalidArgument'] [ 1316.333102] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Traceback (most recent call last): [ 1316.333102] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1316.333102] 
env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] self.driver.spawn(context, instance, image_meta, [ 1316.333102] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1316.333102] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1316.333102] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1316.333102] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] self._fetch_image_if_missing(context, vi) [ 1316.333361] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1316.333361] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] image_cache(vi, tmp_image_ds_loc) [ 1316.333361] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1316.333361] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] vm_util.copy_virtual_disk( [ 1316.333361] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1316.333361] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] session._wait_for_task(vmdk_copy_task) [ 1316.333361] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1316.333361] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] return self.wait_for_task(task_ref) [ 1316.333361] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1316.333361] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] return evt.wait() [ 1316.333361] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1316.333361] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] result = hub.switch() [ 1316.333361] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1316.333667] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] return self.greenlet.switch() [ 1316.333667] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1316.333667] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] self.f(*self.args, **self.kw) [ 1316.333667] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1316.333667] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] raise exceptions.translate_fault(task_info.error) [ 1316.333667] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1316.333667] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Faults: ['InvalidArgument'] [ 1316.333667] env[67964]: ERROR nova.compute.manager [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] [ 1316.333667] env[67964]: DEBUG nova.compute.utils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1316.334680] env[67964]: DEBUG nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Build of instance 9e47d3ce-3897-458b-ac85-d98745e9aeb5 was re-scheduled: A specified parameter was not correct: fileType [ 1316.334680] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1316.335046] env[67964]: DEBUG nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1316.335248] env[67964]: DEBUG nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1316.335441] env[67964]: DEBUG nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1316.335629] env[67964]: DEBUG nova.network.neutron [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1316.636786] env[67964]: DEBUG nova.network.neutron [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1316.655107] env[67964]: INFO nova.compute.manager [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Took 0.32 seconds to deallocate network for instance. [ 1316.750698] env[67964]: INFO nova.scheduler.client.report [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Deleted allocations for instance 9e47d3ce-3897-458b-ac85-d98745e9aeb5 [ 1316.770354] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d161b52-1427-4c87-8af0-b2ae207130f1 tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Lock "9e47d3ce-3897-458b-ac85-d98745e9aeb5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 677.494s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1316.771480] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c2f5389e-f1da-449a-8ed0-e6bb3b31e53f tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Lock "9e47d3ce-3897-458b-ac85-d98745e9aeb5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 480.518s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1316.771697] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c2f5389e-f1da-449a-8ed0-e6bb3b31e53f tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Acquiring lock "9e47d3ce-3897-458b-ac85-d98745e9aeb5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1316.771901] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c2f5389e-f1da-449a-8ed0-e6bb3b31e53f tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Lock "9e47d3ce-3897-458b-ac85-d98745e9aeb5-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1316.772078] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c2f5389e-f1da-449a-8ed0-e6bb3b31e53f tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Lock "9e47d3ce-3897-458b-ac85-d98745e9aeb5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1316.773997] env[67964]: INFO nova.compute.manager [None req-c2f5389e-f1da-449a-8ed0-e6bb3b31e53f tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Terminating instance [ 1316.776062] env[67964]: DEBUG nova.compute.manager [None req-c2f5389e-f1da-449a-8ed0-e6bb3b31e53f tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1316.776276] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c2f5389e-f1da-449a-8ed0-e6bb3b31e53f tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1316.776779] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-bef799ed-164c-455a-b0f8-64aa4055ef9e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.786910] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de8918c5-1d31-4bad-9c5c-cad051f4243b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.797318] env[67964]: DEBUG nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1316.818209] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-c2f5389e-f1da-449a-8ed0-e6bb3b31e53f tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9e47d3ce-3897-458b-ac85-d98745e9aeb5 could not be found. 
[ 1316.818448] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c2f5389e-f1da-449a-8ed0-e6bb3b31e53f tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1316.818637] env[67964]: INFO nova.compute.manager [None req-c2f5389e-f1da-449a-8ed0-e6bb3b31e53f tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1316.818869] env[67964]: DEBUG oslo.service.loopingcall [None req-c2f5389e-f1da-449a-8ed0-e6bb3b31e53f tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1316.819107] env[67964]: DEBUG nova.compute.manager [-] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1316.819203] env[67964]: DEBUG nova.network.neutron [-] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1316.844207] env[67964]: DEBUG nova.network.neutron [-] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1316.846524] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1316.846752] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1316.848227] env[67964]: INFO nova.compute.claims [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1316.851898] env[67964]: INFO nova.compute.manager [-] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] Took 0.03 seconds to deallocate network for instance. 
[ 1316.942108] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c2f5389e-f1da-449a-8ed0-e6bb3b31e53f tempest-ServersTestFqdnHostnames-649911949 tempest-ServersTestFqdnHostnames-649911949-project-member] Lock "9e47d3ce-3897-458b-ac85-d98745e9aeb5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.170s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1316.943221] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "9e47d3ce-3897-458b-ac85-d98745e9aeb5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 157.225s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1316.943755] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9e47d3ce-3897-458b-ac85-d98745e9aeb5] During sync_power_state the instance has a pending task (deleting). Skip. [ 1316.944025] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "9e47d3ce-3897-458b-ac85-d98745e9aeb5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1317.127529] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c4e5ec0-1de3-44eb-bc0f-e33a14e3874b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.135203] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed459b3d-d268-44ed-a51f-c92af9125ffd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.164652] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3ee67d5-6821-42ad-87bf-493210023472 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.172066] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0f8ce92-cf96-43d8-80e7-1fa8e584cef2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.184735] env[67964]: DEBUG nova.compute.provider_tree [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1317.193635] env[67964]: DEBUG nova.scheduler.client.report [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 
1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1317.209904] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.363s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1317.210421] env[67964]: DEBUG nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1317.246603] env[67964]: DEBUG nova.compute.utils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1317.248295] env[67964]: DEBUG nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1317.248463] env[67964]: DEBUG nova.network.neutron [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1317.256542] env[67964]: DEBUG nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1317.319870] env[67964]: DEBUG nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1317.323409] env[67964]: DEBUG nova.policy [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '14ff35bdbab24b67babe14468977fb5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8e8e324632da417faf1891a9aafc8e09', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1317.344779] env[67964]: DEBUG nova.virt.hardware [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1317.344997] env[67964]: DEBUG nova.virt.hardware [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1317.345192] env[67964]: DEBUG nova.virt.hardware [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1317.345391] env[67964]: DEBUG nova.virt.hardware [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1317.345538] env[67964]: DEBUG nova.virt.hardware [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1317.345681] env[67964]: DEBUG nova.virt.hardware [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, 
cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1317.345885] env[67964]: DEBUG nova.virt.hardware [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1317.346064] env[67964]: DEBUG nova.virt.hardware [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1317.346239] env[67964]: DEBUG nova.virt.hardware [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1317.346400] env[67964]: DEBUG nova.virt.hardware [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1317.346570] env[67964]: DEBUG nova.virt.hardware [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1317.347453] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e94ed486-48c8-41a1-a566-d470c2549aea {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.355579] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e3c49c1-8607-44df-8f8f-49f0358b2d7b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.657278] env[67964]: DEBUG nova.network.neutron [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Successfully created port: 94a0f415-99b6-44d9-ac3b-e61e1b45c5bc {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1318.329997] env[67964]: DEBUG nova.network.neutron [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Successfully updated port: 94a0f415-99b6-44d9-ac3b-e61e1b45c5bc {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1318.343462] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 
tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Acquiring lock "refresh_cache-18d6df82-a19a-499a-8874-171218569651" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1318.343620] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Acquired lock "refresh_cache-18d6df82-a19a-499a-8874-171218569651" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1318.343765] env[67964]: DEBUG nova.network.neutron [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1318.390891] env[67964]: DEBUG nova.network.neutron [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1318.599164] env[67964]: DEBUG nova.network.neutron [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Updating instance_info_cache with network_info: [{"id": "94a0f415-99b6-44d9-ac3b-e61e1b45c5bc", "address": "fa:16:3e:d2:11:8d", "network": {"id": "91951e36-89b8-4909-ae5f-6925f9a94952", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-452928727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8e8e324632da417faf1891a9aafc8e09", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap94a0f415-99", "ovs_interfaceid": "94a0f415-99b6-44d9-ac3b-e61e1b45c5bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1318.611853] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Releasing lock "refresh_cache-18d6df82-a19a-499a-8874-171218569651" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1318.612184] env[67964]: DEBUG nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c 
tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Instance network_info: |[{"id": "94a0f415-99b6-44d9-ac3b-e61e1b45c5bc", "address": "fa:16:3e:d2:11:8d", "network": {"id": "91951e36-89b8-4909-ae5f-6925f9a94952", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-452928727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8e8e324632da417faf1891a9aafc8e09", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap94a0f415-99", "ovs_interfaceid": "94a0f415-99b6-44d9-ac3b-e61e1b45c5bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1318.612610] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d2:11:8d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '47499d09-8010-4d02-ac96-4f057c104692', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '94a0f415-99b6-44d9-ac3b-e61e1b45c5bc', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1318.620354] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Creating folder: Project (8e8e324632da417faf1891a9aafc8e09). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1318.620874] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-47f02ea0-03da-481e-8c3f-22e04b56712b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1318.631257] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Created folder: Project (8e8e324632da417faf1891a9aafc8e09) in parent group-v690366. [ 1318.631458] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Creating folder: Instances. Parent ref: group-v690445. 
{{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1318.631649] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e90ecdac-3d70-46f7-a727-2fa0e5147ffe {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1318.641942] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Created folder: Instances in parent group-v690445. [ 1318.642188] env[67964]: DEBUG oslo.service.loopingcall [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1318.642374] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 18d6df82-a19a-499a-8874-171218569651] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1318.642555] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1d0b0a2e-2a2f-4e56-ad10-a5327ca610c0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1318.661043] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1318.661043] env[67964]: value = "task-3456813" [ 1318.661043] env[67964]: _type = "Task" [ 1318.661043] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1318.667752] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456813, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1318.679053] env[67964]: DEBUG nova.compute.manager [req-e660db6e-bb8c-4d61-bb6a-5c44432994ba req-5a8ed3f1-d516-4a25-8048-8f0451861887 service nova] [instance: 18d6df82-a19a-499a-8874-171218569651] Received event network-vif-plugged-94a0f415-99b6-44d9-ac3b-e61e1b45c5bc {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1318.679267] env[67964]: DEBUG oslo_concurrency.lockutils [req-e660db6e-bb8c-4d61-bb6a-5c44432994ba req-5a8ed3f1-d516-4a25-8048-8f0451861887 service nova] Acquiring lock "18d6df82-a19a-499a-8874-171218569651-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1318.679457] env[67964]: DEBUG oslo_concurrency.lockutils [req-e660db6e-bb8c-4d61-bb6a-5c44432994ba req-5a8ed3f1-d516-4a25-8048-8f0451861887 service nova] Lock "18d6df82-a19a-499a-8874-171218569651-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1318.679672] env[67964]: DEBUG oslo_concurrency.lockutils [req-e660db6e-bb8c-4d61-bb6a-5c44432994ba req-5a8ed3f1-d516-4a25-8048-8f0451861887 service nova] Lock "18d6df82-a19a-499a-8874-171218569651-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.679799] env[67964]: DEBUG nova.compute.manager [req-e660db6e-bb8c-4d61-bb6a-5c44432994ba req-5a8ed3f1-d516-4a25-8048-8f0451861887 service nova] [instance: 18d6df82-a19a-499a-8874-171218569651] No waiting events found dispatching network-vif-plugged-94a0f415-99b6-44d9-ac3b-e61e1b45c5bc {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1318.679964] env[67964]: WARNING nova.compute.manager [req-e660db6e-bb8c-4d61-bb6a-5c44432994ba req-5a8ed3f1-d516-4a25-8048-8f0451861887 service nova] [instance: 18d6df82-a19a-499a-8874-171218569651] Received unexpected event network-vif-plugged-94a0f415-99b6-44d9-ac3b-e61e1b45c5bc for instance with vm_state building and task_state spawning. [ 1318.680115] env[67964]: DEBUG nova.compute.manager [req-e660db6e-bb8c-4d61-bb6a-5c44432994ba req-5a8ed3f1-d516-4a25-8048-8f0451861887 service nova] [instance: 18d6df82-a19a-499a-8874-171218569651] Received event network-changed-94a0f415-99b6-44d9-ac3b-e61e1b45c5bc {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1318.680266] env[67964]: DEBUG nova.compute.manager [req-e660db6e-bb8c-4d61-bb6a-5c44432994ba req-5a8ed3f1-d516-4a25-8048-8f0451861887 service nova] [instance: 18d6df82-a19a-499a-8874-171218569651] Refreshing instance network info cache due to event network-changed-94a0f415-99b6-44d9-ac3b-e61e1b45c5bc. 
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1318.680444] env[67964]: DEBUG oslo_concurrency.lockutils [req-e660db6e-bb8c-4d61-bb6a-5c44432994ba req-5a8ed3f1-d516-4a25-8048-8f0451861887 service nova] Acquiring lock "refresh_cache-18d6df82-a19a-499a-8874-171218569651" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1318.680573] env[67964]: DEBUG oslo_concurrency.lockutils [req-e660db6e-bb8c-4d61-bb6a-5c44432994ba req-5a8ed3f1-d516-4a25-8048-8f0451861887 service nova] Acquired lock "refresh_cache-18d6df82-a19a-499a-8874-171218569651" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1318.680720] env[67964]: DEBUG nova.network.neutron [req-e660db6e-bb8c-4d61-bb6a-5c44432994ba req-5a8ed3f1-d516-4a25-8048-8f0451861887 service nova] [instance: 18d6df82-a19a-499a-8874-171218569651] Refreshing network info cache for port 94a0f415-99b6-44d9-ac3b-e61e1b45c5bc {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1318.953840] env[67964]: DEBUG nova.network.neutron [req-e660db6e-bb8c-4d61-bb6a-5c44432994ba req-5a8ed3f1-d516-4a25-8048-8f0451861887 service nova] [instance: 18d6df82-a19a-499a-8874-171218569651] Updated VIF entry in instance network info cache for port 94a0f415-99b6-44d9-ac3b-e61e1b45c5bc. {{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1318.954217] env[67964]: DEBUG nova.network.neutron [req-e660db6e-bb8c-4d61-bb6a-5c44432994ba req-5a8ed3f1-d516-4a25-8048-8f0451861887 service nova] [instance: 18d6df82-a19a-499a-8874-171218569651] Updating instance_info_cache with network_info: [{"id": "94a0f415-99b6-44d9-ac3b-e61e1b45c5bc", "address": "fa:16:3e:d2:11:8d", "network": {"id": "91951e36-89b8-4909-ae5f-6925f9a94952", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-452928727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8e8e324632da417faf1891a9aafc8e09", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap94a0f415-99", "ovs_interfaceid": "94a0f415-99b6-44d9-ac3b-e61e1b45c5bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1318.964282] env[67964]: DEBUG oslo_concurrency.lockutils [req-e660db6e-bb8c-4d61-bb6a-5c44432994ba req-5a8ed3f1-d516-4a25-8048-8f0451861887 service nova] Releasing lock "refresh_cache-18d6df82-a19a-499a-8874-171218569651" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1319.171531] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456813, 'name': CreateVM_Task, 'duration_secs': 0.317169} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1319.171531] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 18d6df82-a19a-499a-8874-171218569651] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1319.172499] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1319.172499] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1319.172499] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1319.172659] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-927c8952-1b33-4c90-97ab-cc6623e1e963 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.177155] env[67964]: DEBUG oslo_vmware.api [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Waiting for the task: (returnval){ [ 1319.177155] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52f8e3f4-40fe-2768-8320-e4e4d8b2ec33" [ 1319.177155] env[67964]: _type = "Task" [ 1319.177155] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1319.184561] env[67964]: DEBUG oslo_vmware.api [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52f8e3f4-40fe-2768-8320-e4e4d8b2ec33, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1319.687796] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1319.688108] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1319.688465] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1322.800142] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1322.800391] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1322.800517] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1322.800666] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1322.811773] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1322.811985] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1322.812163] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1322.812318] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1322.813402] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f5feafa-211a-4979-9d35-152c00f36db6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1322.822150] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c517e4f6-7b80-469f-8d06-ffb5106a966a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1322.836965] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f6b4a5b-4be0-465d-8fb6-21a4bdd23be9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1322.843056] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19d357fa-cdc4-421c-93bd-eeeecae343d6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1322.871408] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180915MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1322.871541] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1322.871725] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1322.945351] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea492fb8-2352-436c-a7d5-f20423f4d353 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1322.945528] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c648c89a-ca70-4a15-9083-0cbe9e5bee23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1322.945656] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9793d383-9033-4f86-b7bb-6b2e43347cd6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1322.945779] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 5fbee4c3-bc7c-4582-b976-b0d619a69cdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1322.945897] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 67eb58c3-a895-4427-9197-3b0c731a123a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1322.946028] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1322.946159] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d9dcb5d4-e8a3-4d4d-af94-1bde87121c08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1322.946274] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9cd7ef82-147a-4303-a773-32b161f819ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1322.946387] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18c148fb-1cd4-4537-9b77-089e9b272f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1322.946647] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18d6df82-a19a-499a-8874-171218569651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1322.957437] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ee34b117-806d-4cc4-98b7-0f40f074cfab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1322.970674] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7825ba9e-8603-4211-b5fe-708276272464 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1322.982397] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance cc4cdc79-2620-42c6-bf3d-0b108a2cbfe0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1322.993594] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fb025130-d995-4615-8dee-59af1700877f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1323.004985] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 60cd7925-3124-449e-8d27-4faa7b27cb9c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1323.014878] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec783231-6f62-4177-ba76-4ba688dda077 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1323.024431] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0f126555-f26e-42da-a468-28a28887c901 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1323.035028] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0228456f-0055-43b9-9a81-e0f031e2a549 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1323.045172] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance a0908e14-521d-42c1-baaa-b5863b1f142d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1323.058052] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d64969c7-d467-4958-8b04-aa2d2920769a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1323.058052] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1323.058052] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1323.264500] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be593e34-0dfe-4781-a7ff-131edfcb8718 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1323.272085] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9223613a-c04e-4bbb-8d9a-63dee4d44476 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1323.302106] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6fba09d-4d8e-4823-8c80-6f058d50eb00 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1323.309286] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81ed52b1-d999-4683-8b40-12c3847ab010 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1323.322244] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1323.331160] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1323.345607] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1323.345779] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.474s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1324.346580] env[67964]: DEBUG oslo_service.periodic_task [None 
req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1324.800320] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1326.800375] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1327.795435] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1327.800070] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1330.801738] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1330.802121] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1330.802121] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1330.825923] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1330.826110] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1330.826241] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1330.826366] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1330.826485] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1330.826606] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1330.826725] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1330.826842] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1330.826958] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1330.827087] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18d6df82-a19a-499a-8874-171218569651] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1330.827207] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1335.736765] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d938c483-6950-480b-b7b4-a14de72cce89 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Acquiring lock "18c148fb-1cd4-4537-9b77-089e9b272f83" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1362.605725] env[67964]: WARNING oslo_vmware.rw_handles [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1362.605725] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1362.605725] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1362.605725] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1362.605725] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1362.605725] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1362.605725] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1362.605725] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1362.605725] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1362.605725] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1362.605725] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1362.605725] env[67964]: ERROR oslo_vmware.rw_handles [ 1362.605725] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/5ddff9b4-2452-4701-a805-e4b83feb77fa/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1362.608188] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1362.608445] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Copying Virtual Disk [datastore1] vmware_temp/5ddff9b4-2452-4701-a805-e4b83feb77fa/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/5ddff9b4-2452-4701-a805-e4b83feb77fa/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1362.608718] env[67964]: DEBUG 
oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-47e597ad-4edd-41e6-a0b5-f982af1053e9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.616293] env[67964]: DEBUG oslo_vmware.api [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Waiting for the task: (returnval){ [ 1362.616293] env[67964]: value = "task-3456814" [ 1362.616293] env[67964]: _type = "Task" [ 1362.616293] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1362.625488] env[67964]: DEBUG oslo_vmware.api [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Task: {'id': task-3456814, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1363.126115] env[67964]: DEBUG oslo_vmware.exceptions [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1363.126356] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1363.126938] env[67964]: ERROR nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1363.126938] env[67964]: Faults: ['InvalidArgument'] [ 1363.126938] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Traceback (most recent call last): [ 1363.126938] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1363.126938] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] yield resources [ 1363.126938] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1363.126938] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] self.driver.spawn(context, instance, image_meta, [ 1363.126938] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1363.126938] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1363.126938] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 
786, in spawn [ 1363.126938] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] self._fetch_image_if_missing(context, vi) [ 1363.126938] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1363.127298] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] image_cache(vi, tmp_image_ds_loc) [ 1363.127298] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1363.127298] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] vm_util.copy_virtual_disk( [ 1363.127298] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1363.127298] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] session._wait_for_task(vmdk_copy_task) [ 1363.127298] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1363.127298] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] return self.wait_for_task(task_ref) [ 1363.127298] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1363.127298] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] return evt.wait() [ 1363.127298] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1363.127298] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] result = hub.switch() [ 1363.127298] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1363.127298] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] return self.greenlet.switch() [ 1363.127662] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1363.127662] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] self.f(*self.args, **self.kw) [ 1363.127662] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1363.127662] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] raise exceptions.translate_fault(task_info.error) [ 1363.127662] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1363.127662] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Faults: ['InvalidArgument'] [ 1363.127662] env[67964]: ERROR nova.compute.manager [instance: 
ea492fb8-2352-436c-a7d5-f20423f4d353] [ 1363.127662] env[67964]: INFO nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Terminating instance [ 1363.128835] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1363.129073] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1363.129324] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c170bf4c-d27c-40dc-b29c-95827343ada7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.131480] env[67964]: DEBUG nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1363.131665] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1363.132431] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16587c3d-d634-4202-a4c9-dc40adc07c23 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.139534] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1363.139770] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-49654e8f-ae1c-49f1-81c8-d8fa498c3055 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.141960] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1363.142149] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Folder 
[datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1363.143092] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-25d1d65b-e256-4bb6-8104-bb66a1b27884 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.147584] env[67964]: DEBUG oslo_vmware.api [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Waiting for the task: (returnval){ [ 1363.147584] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52ac2fca-466f-88c7-820d-f7ddcd0ee727" [ 1363.147584] env[67964]: _type = "Task" [ 1363.147584] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1363.154771] env[67964]: DEBUG oslo_vmware.api [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52ac2fca-466f-88c7-820d-f7ddcd0ee727, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1363.214186] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1363.214412] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1363.214590] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Deleting the datastore file [datastore1] ea492fb8-2352-436c-a7d5-f20423f4d353 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1363.214873] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-48d2294d-c5dd-480b-9fff-75e6e35290e8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.221521] env[67964]: DEBUG oslo_vmware.api [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Waiting for the task: (returnval){ [ 1363.221521] env[67964]: value = "task-3456816" [ 1363.221521] env[67964]: _type = "Task" [ 1363.221521] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1363.229426] env[67964]: DEBUG oslo_vmware.api [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Task: {'id': task-3456816, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1363.658734] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1363.659093] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Creating directory with path [datastore1] vmware_temp/98f46b2e-5119-4c97-a801-892d6ec843d5/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1363.659256] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f4b62bfc-2b20-468d-a0c2-6d6f0d85823c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.671510] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Created directory with path [datastore1] vmware_temp/98f46b2e-5119-4c97-a801-892d6ec843d5/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1363.671743] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Fetch image to [datastore1] vmware_temp/98f46b2e-5119-4c97-a801-892d6ec843d5/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1363.671919] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/98f46b2e-5119-4c97-a801-892d6ec843d5/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1363.672688] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b38bd393-a835-4c88-b367-fc85091d5d5f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.679468] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9842d76d-17c3-423a-8bfb-e6d03f51b198 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.689063] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6f286e6-8ada-444d-97e5-6fdc892cdaef {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.718766] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ab33b7fc-da59-422b-be6d-3ee11b68bb79 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.728311] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-785c1dc3-d591-4120-9c26-b093d12ebebb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.733436] env[67964]: DEBUG oslo_vmware.api [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Task: {'id': task-3456816, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063126} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1363.733666] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1363.733845] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1363.734016] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1363.734235] env[67964]: INFO nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Took 0.60 seconds to destroy the instance on the hypervisor. 
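The destroy sequence just logged (UnregisterVM, FileManager.DeleteDatastoreFile_Task, wait_for_task, "Deleted contents of the VM", then "Aborting claim" under the "compute_resources" lock) follows a small, repeatable pattern. Below is a minimal sketch of that pattern, assuming an oslo.vmware session whose invoke_api() and wait_for_task() behave as the oslo_vmware/api.py frames in this log show; the helper names delete_datastore_file and abort_claim, and the resource_tracker argument, are hypothetical stand-ins, not Nova's actual method signatures.

    # Sketch only: mirrors the delete-file/wait/abort-claim flow in the log above.
    from oslo_concurrency import lockutils

    def delete_datastore_file(session, file_manager, datacenter, ds_path):
        # FileManager.DeleteDatastoreFile_Task is the vSphere call the log shows
        # being invoked; invoke_api/wait_for_task are real oslo.vmware methods.
        task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                                  file_manager, name=ds_path,
                                  datacenter=datacenter)
        # Blocks while polling the task (the "_poll_task ... progress is 0%"
        # lines above) and raises a translated fault if the task errors.
        session.wait_for_task(task)

    def abort_claim(resource_tracker, instance):
        # The log shows the abort serialized on "compute_resources" so it
        # cannot race the periodic update_available_resource audit.
        with lockutils.lock('compute_resources'):
            resource_tracker.abort_instance_claim(instance)  # hypothetical name

Note the ordering in the log: the datastore file delete completes (task-3456816) before the claim is aborted, so placement allocations are only released once the hypervisor-side cleanup has actually finished.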
[ 1363.736385] env[67964]: DEBUG nova.compute.claims [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1363.736632] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1363.736939] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1363.749361] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1363.975640] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1363.977291] env[67964]: ERROR nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image b261268a-9800-40a9-afde-85d61f8eed6a. 
[ 1363.977291] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Traceback (most recent call last): [ 1363.977291] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1363.977291] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1363.977291] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1363.977291] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] result = getattr(controller, method)(*args, **kwargs) [ 1363.977291] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1363.977291] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self._get(image_id) [ 1363.977291] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1363.977291] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1363.977291] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1363.977603] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] resp, body = self.http_client.get(url, headers=header) [ 1363.977603] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1363.977603] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self.request(url, 'GET', **kwargs) [ 1363.977603] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1363.977603] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self._handle_response(resp) [ 1363.977603] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1363.977603] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise exc.from_response(resp, resp.content) [ 1363.977603] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1363.977603] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1363.977603] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] During handling of the above exception, another exception occurred: [ 1363.977603] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1363.977603] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Traceback (most recent call last): [ 1363.977965] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1363.977965] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] yield resources [ 1363.977965] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1363.977965] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self.driver.spawn(context, instance, image_meta, [ 1363.977965] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1363.977965] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1363.977965] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1363.977965] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self._fetch_image_if_missing(context, vi) [ 1363.977965] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1363.977965] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] image_fetch(context, vi, tmp_image_ds_loc) [ 1363.977965] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1363.977965] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] images.fetch_image( [ 1363.977965] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1363.978331] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] metadata = IMAGE_API.get(context, image_ref) [ 1363.978331] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1363.978331] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return session.show(context, image_id, [ 1363.978331] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1363.978331] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] _reraise_translated_image_exception(image_id) [ 1363.978331] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1363.978331] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise new_exc.with_traceback(exc_trace) [ 1363.978331] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1363.978331] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1363.978331] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1363.978331] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] result = getattr(controller, method)(*args, **kwargs) [ 1363.978331] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1363.978331] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self._get(image_id) [ 1363.978677] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1363.978677] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1363.978677] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1363.978677] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] resp, body = self.http_client.get(url, headers=header) [ 1363.978677] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1363.978677] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self.request(url, 'GET', **kwargs) [ 1363.978677] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1363.978677] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self._handle_response(resp) [ 1363.978677] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1363.978677] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise exc.from_response(resp, resp.content) [ 1363.978677] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] nova.exception.ImageNotAuthorized: Not authorized for image b261268a-9800-40a9-afde-85d61f8eed6a. 
[ 1363.978677] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1363.978995] env[67964]: INFO nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Terminating instance [ 1363.979234] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1363.979438] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1363.982079] env[67964]: DEBUG nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1363.982277] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1363.982524] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4f21bb9b-a3ad-4d31-a5bf-1756dc1d3c43 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.985495] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cae11b3f-475d-477e-b1ac-7df6961c1530 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.992955] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1363.993204] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-79bb64c4-1895-4cac-a4fd-ed43cc1d3811 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.995368] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1363.995544] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd 
tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1363.996561] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1cf100fc-3bc8-43c7-8a0c-90891791eec2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.003615] env[67964]: DEBUG oslo_vmware.api [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for the task: (returnval){ [ 1364.003615] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]521bfb3f-6156-31f4-3cb0-9e6ff208555b" [ 1364.003615] env[67964]: _type = "Task" [ 1364.003615] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1364.014724] env[67964]: DEBUG oslo_vmware.api [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]521bfb3f-6156-31f4-3cb0-9e6ff208555b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1364.033395] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da3897af-d8a0-4ecd-8ed8-6eabe69f2630 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.040800] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d05a62b-003e-4704-83be-3eec7bb00466 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.071836] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7baa3812-aff5-432e-a5e8-fbd4254e4a29 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.074451] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1364.074695] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1364.074902] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Deleting the datastore file [datastore1] c648c89a-ca70-4a15-9083-0cbe9e5bee23 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1364.075198] env[67964]: DEBUG oslo_vmware.service [-] Invoking 
FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-47cb32ab-c008-460b-ad01-fd3b42c5ed04 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.081409] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd4b5154-3610-4956-9927-07e1895da4cd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.085785] env[67964]: DEBUG oslo_vmware.api [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Waiting for the task: (returnval){ [ 1364.085785] env[67964]: value = "task-3456818" [ 1364.085785] env[67964]: _type = "Task" [ 1364.085785] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1364.098293] env[67964]: DEBUG nova.compute.provider_tree [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1364.101936] env[67964]: DEBUG oslo_vmware.api [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Task: {'id': task-3456818, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1364.106790] env[67964]: DEBUG nova.scheduler.client.report [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1364.121825] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.385s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1364.122358] env[67964]: ERROR nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1364.122358] env[67964]: Faults: ['InvalidArgument'] [ 1364.122358] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Traceback (most recent call last): [ 1364.122358] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File 
"/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1364.122358] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] self.driver.spawn(context, instance, image_meta, [ 1364.122358] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1364.122358] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1364.122358] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1364.122358] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] self._fetch_image_if_missing(context, vi) [ 1364.122358] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1364.122358] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] image_cache(vi, tmp_image_ds_loc) [ 1364.122358] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1364.122721] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] vm_util.copy_virtual_disk( [ 1364.122721] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1364.122721] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] session._wait_for_task(vmdk_copy_task) [ 1364.122721] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1364.122721] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] return self.wait_for_task(task_ref) [ 1364.122721] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1364.122721] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] return evt.wait() [ 1364.122721] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1364.122721] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] result = hub.switch() [ 1364.122721] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1364.122721] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] return self.greenlet.switch() [ 1364.122721] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1364.122721] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] self.f(*self.args, **self.kw) [ 1364.123089] 
env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1364.123089] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] raise exceptions.translate_fault(task_info.error) [ 1364.123089] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1364.123089] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Faults: ['InvalidArgument'] [ 1364.123089] env[67964]: ERROR nova.compute.manager [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] [ 1364.123089] env[67964]: DEBUG nova.compute.utils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1364.124370] env[67964]: DEBUG nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Build of instance ea492fb8-2352-436c-a7d5-f20423f4d353 was re-scheduled: A specified parameter was not correct: fileType [ 1364.124370] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1364.124731] env[67964]: DEBUG nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1364.124900] env[67964]: DEBUG nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1364.125081] env[67964]: DEBUG nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1364.125252] env[67964]: DEBUG nova.network.neutron [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1364.521550] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1364.521824] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Creating directory with path [datastore1] vmware_temp/cee42da6-1aea-445c-a566-d6b26e9b370a/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1364.522089] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e29a49e5-4058-490f-8d9e-8399036db653 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.534312] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Created directory with path [datastore1] vmware_temp/cee42da6-1aea-445c-a566-d6b26e9b370a/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1364.534494] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Fetch image to [datastore1] vmware_temp/cee42da6-1aea-445c-a566-d6b26e9b370a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1364.534793] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/cee42da6-1aea-445c-a566-d6b26e9b370a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1364.535583] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9464e974-bd8d-43a6-be46-e164a5fbf147 {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.546943] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea2562e1-64c9-4b67-8c74-574f6dda48c8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.556421] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd281863-c9d2-4065-9000-fb5b5d042595 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.560895] env[67964]: DEBUG nova.network.neutron [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1364.589907] env[67964]: INFO nova.compute.manager [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Took 0.46 seconds to deallocate network for instance. [ 1364.595359] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d09495c7-5ee6-42b0-85ce-2e3da2328f3c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.605030] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d95b989f-e3fb-4a35-b51e-c97d83505916 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.609356] env[67964]: DEBUG oslo_vmware.api [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Task: {'id': task-3456818, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074922} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1364.609356] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1364.609356] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1364.609356] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1364.609356] env[67964]: INFO nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Took 0.63 seconds to destroy the instance on the hypervisor. [ 1364.609840] env[67964]: DEBUG nova.compute.claims [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1364.610030] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1364.610296] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1364.630930] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1364.692075] env[67964]: INFO nova.scheduler.client.report [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Deleted allocations for instance ea492fb8-2352-436c-a7d5-f20423f4d353 [ 1364.701383] env[67964]: DEBUG oslo_vmware.rw_handles [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 
tempest-ServerDiskConfigTestJSON-988333261-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cee42da6-1aea-445c-a566-d6b26e9b370a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1364.760660] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76bcb309-4ca0-439f-a5ed-3b547d83ad8d tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Lock "ea492fb8-2352-436c-a7d5-f20423f4d353" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 622.499s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1364.761964] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e748d981-02ef-4eda-8dae-344b3aced10e tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Lock "ea492fb8-2352-436c-a7d5-f20423f4d353" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 426.686s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1364.762195] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e748d981-02ef-4eda-8dae-344b3aced10e tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Acquiring lock "ea492fb8-2352-436c-a7d5-f20423f4d353-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1364.762393] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e748d981-02ef-4eda-8dae-344b3aced10e tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Lock "ea492fb8-2352-436c-a7d5-f20423f4d353-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1364.762554] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e748d981-02ef-4eda-8dae-344b3aced10e tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Lock "ea492fb8-2352-436c-a7d5-f20423f4d353-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1364.765719] env[67964]: INFO nova.compute.manager [None req-e748d981-02ef-4eda-8dae-344b3aced10e tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Terminating instance [ 1364.767188] env[67964]: DEBUG oslo_vmware.rw_handles [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Completed reading data from the image iterator.
{{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1364.767353] env[67964]: DEBUG oslo_vmware.rw_handles [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cee42da6-1aea-445c-a566-d6b26e9b370a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1364.768347] env[67964]: DEBUG nova.compute.manager [None req-e748d981-02ef-4eda-8dae-344b3aced10e tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1364.768536] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e748d981-02ef-4eda-8dae-344b3aced10e tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1364.768780] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b7353cce-b77b-4f40-b8d6-737744c31e48 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.774373] env[67964]: DEBUG nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1364.784246] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cd25fbc-d4d0-43eb-9f0a-f10f098ddd8b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.816723] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-e748d981-02ef-4eda-8dae-344b3aced10e tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ea492fb8-2352-436c-a7d5-f20423f4d353 could not be found. [ 1364.820021] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e748d981-02ef-4eda-8dae-344b3aced10e tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1364.820021] env[67964]: INFO nova.compute.manager [None req-e748d981-02ef-4eda-8dae-344b3aced10e tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Took 0.05 seconds to destroy the instance on the hypervisor. 
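The "Waiting for the task ... to complete" and "progress is 0%" records above, followed by "completed successfully", come from oslo.vmware's wait_for_task/_poll_task loop: the vCenter task is polled until it reaches a terminal state, and an error state is translated into an exception (the VimFaultException tracebacks earlier end in exactly that translate_fault call). A rough, generic sketch of the polling shape; get_task_info, the dict layout, and the state names are illustrative stand-ins, not oslo.vmware's actual API:

    import time

    class TaskFailed(Exception):
        """Illustrative stand-in for a translated VIM task fault."""

    def wait_for_task(get_task_info, interval=0.5):
        # Poll the task until it leaves the running states, mirroring the
        # 'progress is N%' DEBUG records in the log above.
        while True:
            info = get_task_info()  # e.g. {'state': ..., 'progress': ...}
            if info["state"] in ("queued", "running"):
                print("progress is %s%%" % info.get("progress", 0))
                time.sleep(interval)
            elif info["state"] == "success":
                return info.get("result")
            else:
                # Error state: surface the fault, analogous to
                # raise exceptions.translate_fault(task_info.error).
                raise TaskFailed(info.get("error"))

For example, wait_for_task(lambda: {"state": "success", "result": "task-3456818"}) returns immediately with the result, while an "error" state raises TaskFailed.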
[ 1364.820021] env[67964]: DEBUG oslo.service.loopingcall [None req-e748d981-02ef-4eda-8dae-344b3aced10e tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1364.820021] env[67964]: DEBUG nova.compute.manager [-] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1364.820021] env[67964]: DEBUG nova.network.neutron [-] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1364.833013] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1364.846758] env[67964]: DEBUG nova.network.neutron [-] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1364.854828] env[67964]: INFO nova.compute.manager [-] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] Took 0.04 seconds to deallocate network for instance. [ 1364.949833] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e748d981-02ef-4eda-8dae-344b3aced10e tempest-ServerRescueTestJSON-398796465 tempest-ServerRescueTestJSON-398796465-project-member] Lock "ea492fb8-2352-436c-a7d5-f20423f4d353" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.188s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1364.950786] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "ea492fb8-2352-436c-a7d5-f20423f4d353" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 205.233s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1364.950974] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea492fb8-2352-436c-a7d5-f20423f4d353] During sync_power_state the instance has a pending task (deleting). Skip.
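The lockutils DEBUG records throughout this section ('acquired ... :: waited 205.233s', '"released" ... :: held 0.188s') come from a wrapper that timestamps both the wait for and the holding of a named lock. A small sketch of that bookkeeping, assuming plain in-process threading.Lock objects rather than oslo.concurrency's fair and external file-based lock variants:

    import threading
    import time
    from contextlib import contextmanager

    # Registry of named locks; real lockutils also guards this registry
    # and supports inter-process locks backed by files.
    _locks = {}

    @contextmanager
    def named_lock(name, by):
        lock = _locks.setdefault(name, threading.Lock())
        start = time.monotonic()
        lock.acquire()
        waited = time.monotonic() - start
        print('Lock "%s" acquired by "%s" :: waited %.3fs'
              % (name, by, waited))
        held_from = time.monotonic()
        try:
            yield
        finally:
            lock.release()
            held = time.monotonic() - held_from
            print('Lock "%s" "released" by "%s" :: held %.3fs'
                  % (name, by, held))

Used as 'with named_lock("compute_resources", "ResourceTracker.instance_claim"): ...', this prints waited/held lines in the same shape as the records above; the long waited times in the log simply mean another request held the same named lock for that long.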
[ 1364.951162] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "ea492fb8-2352-436c-a7d5-f20423f4d353" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1364.988784] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ef4b3cc-d7e7-4d72-9ee2-95f2d7e32f08 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.997502] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2880113f-0ce2-41fe-b03b-eab79124c0ca {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.028178] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1fef324-0b13-4dd4-898e-e9e1311c01c8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.035724] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f81112fc-17c8-40e6-8250-cc59ca9a71e5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.048765] env[67964]: DEBUG nova.compute.provider_tree [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1365.056588] env[67964]: DEBUG nova.scheduler.client.report [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1365.068651] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.458s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.069384] env[67964]: ERROR nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image b261268a-9800-40a9-afde-85d61f8eed6a.
[ 1365.069384] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Traceback (most recent call last): [ 1365.069384] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1365.069384] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1365.069384] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1365.069384] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] result = getattr(controller, method)(*args, **kwargs) [ 1365.069384] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1365.069384] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self._get(image_id) [ 1365.069384] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1365.069384] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1365.069384] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1365.069704] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] resp, body = self.http_client.get(url, headers=header) [ 1365.069704] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1365.069704] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self.request(url, 'GET', **kwargs) [ 1365.069704] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1365.069704] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self._handle_response(resp) [ 1365.069704] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1365.069704] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise exc.from_response(resp, resp.content) [ 1365.069704] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1365.069704] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.069704] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] During handling of the above exception, another exception occurred: [ 1365.069704] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.069704] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Traceback (most recent call last): [ 1365.070091] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1365.070091] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self.driver.spawn(context, instance, image_meta, [ 1365.070091] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1365.070091] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1365.070091] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1365.070091] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self._fetch_image_if_missing(context, vi) [ 1365.070091] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1365.070091] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] image_fetch(context, vi, tmp_image_ds_loc) [ 1365.070091] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1365.070091] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] images.fetch_image( [ 1365.070091] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1365.070091] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] metadata = IMAGE_API.get(context, image_ref) [ 1365.070091] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1365.071306] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return session.show(context, image_id, [ 1365.071306] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1365.071306] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] _reraise_translated_image_exception(image_id) [ 1365.071306] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1365.071306] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise new_exc.with_traceback(exc_trace) [ 1365.071306] env[67964]: ERROR nova.compute.manager [instance: 
c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1365.071306] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1365.071306] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1365.071306] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] result = getattr(controller, method)(*args, **kwargs) [ 1365.071306] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1365.071306] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self._get(image_id) [ 1365.071306] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1365.071306] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1365.071608] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1365.071608] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] resp, body = self.http_client.get(url, headers=header) [ 1365.071608] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1365.071608] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self.request(url, 'GET', **kwargs) [ 1365.071608] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1365.071608] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self._handle_response(resp) [ 1365.071608] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1365.071608] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise exc.from_response(resp, resp.content) [ 1365.071608] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] nova.exception.ImageNotAuthorized: Not authorized for image b261268a-9800-40a9-afde-85d61f8eed6a. [ 1365.071608] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.071829] env[67964]: DEBUG nova.compute.utils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Not authorized for image b261268a-9800-40a9-afde-85d61f8eed6a. 
{{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1365.071829] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.238s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1365.072580] env[67964]: INFO nova.compute.claims [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1365.075092] env[67964]: DEBUG nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Build of instance c648c89a-ca70-4a15-9083-0cbe9e5bee23 was re-scheduled: Not authorized for image b261268a-9800-40a9-afde-85d61f8eed6a. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1365.075562] env[67964]: DEBUG nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1365.075760] env[67964]: DEBUG nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1365.075908] env[67964]: DEBUG nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1365.076127] env[67964]: DEBUG nova.network.neutron [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1365.198225] env[67964]: DEBUG neutronclient.v2_0.client [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67964) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1365.200714] env[67964]: ERROR nova.compute.manager [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1365.200714] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Traceback (most recent call last): [ 1365.200714] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1365.200714] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1365.200714] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1365.200714] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] result = getattr(controller, method)(*args, **kwargs) [ 1365.200714] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1365.200714] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self._get(image_id) [ 1365.200714] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1365.200714] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1365.200714] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1365.201063] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] resp, body = self.http_client.get(url, headers=header) [ 1365.201063] env[67964]: ERROR nova.compute.manager [instance: 
c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1365.201063] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self.request(url, 'GET', **kwargs) [ 1365.201063] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1365.201063] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self._handle_response(resp) [ 1365.201063] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1365.201063] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise exc.from_response(resp, resp.content) [ 1365.201063] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1365.201063] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.201063] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] During handling of the above exception, another exception occurred: [ 1365.201063] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.201063] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Traceback (most recent call last): [ 1365.201375] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1365.201375] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self.driver.spawn(context, instance, image_meta, [ 1365.201375] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1365.201375] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1365.201375] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1365.201375] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self._fetch_image_if_missing(context, vi) [ 1365.201375] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1365.201375] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] image_fetch(context, vi, tmp_image_ds_loc) [ 1365.201375] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1365.201375] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] 
images.fetch_image( [ 1365.201375] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1365.201375] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] metadata = IMAGE_API.get(context, image_ref) [ 1365.201375] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1365.201709] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return session.show(context, image_id, [ 1365.201709] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1365.201709] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] _reraise_translated_image_exception(image_id) [ 1365.201709] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1365.201709] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise new_exc.with_traceback(exc_trace) [ 1365.201709] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1365.201709] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1365.201709] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1365.201709] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] result = getattr(controller, method)(*args, **kwargs) [ 1365.201709] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1365.201709] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self._get(image_id) [ 1365.201709] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1365.201709] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1365.202067] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1365.202067] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] resp, body = self.http_client.get(url, headers=header) [ 1365.202067] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1365.202067] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self.request(url, 'GET', **kwargs) [ 1365.202067] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1365.202067] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self._handle_response(resp) [ 1365.202067] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1365.202067] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise exc.from_response(resp, resp.content) [ 1365.202067] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] nova.exception.ImageNotAuthorized: Not authorized for image b261268a-9800-40a9-afde-85d61f8eed6a. [ 1365.202067] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.202067] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] During handling of the above exception, another exception occurred: [ 1365.202067] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.202067] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Traceback (most recent call last): [ 1365.202400] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 2431, in _do_build_and_run_instance [ 1365.202400] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self._build_and_run_instance(context, instance, image, [ 1365.202400] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 2723, in _build_and_run_instance [ 1365.202400] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise exception.RescheduledException( [ 1365.202400] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] nova.exception.RescheduledException: Build of instance c648c89a-ca70-4a15-9083-0cbe9e5bee23 was re-scheduled: Not authorized for image b261268a-9800-40a9-afde-85d61f8eed6a. 
[ 1365.202400] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.202400] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] During handling of the above exception, another exception occurred: [ 1365.202400] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.202400] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Traceback (most recent call last): [ 1365.202400] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.202400] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] ret = obj(*args, **kwargs) [ 1365.202400] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1365.202400] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] exception_handler_v20(status_code, error_body) [ 1365.202742] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1365.202742] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise client_exc(message=error_message, [ 1365.202742] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1365.202742] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Neutron server returns request_ids: ['req-1e71ed6f-754b-4d27-bb05-d93a9c7af972'] [ 1365.202742] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.202742] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] During handling of the above exception, another exception occurred: [ 1365.202742] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.202742] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Traceback (most recent call last): [ 1365.202742] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 3020, in _cleanup_allocated_networks [ 1365.202742] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self._deallocate_network(context, instance, requested_networks) [ 1365.202742] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1365.202742] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self.network_api.deallocate_for_instance( [ 1365.202742] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1365.203152] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] data = neutron.list_ports(**search_opts) [ 
1365.203152] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.203152] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] ret = obj(*args, **kwargs) [ 1365.203152] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1365.203152] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self.list('ports', self.ports_path, retrieve_all, [ 1365.203152] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.203152] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] ret = obj(*args, **kwargs) [ 1365.203152] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1365.203152] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] for r in self._pagination(collection, path, **params): [ 1365.203152] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1365.203152] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] res = self.get(path, params=params) [ 1365.203152] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.203152] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] ret = obj(*args, **kwargs) [ 1365.203501] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1365.203501] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self.retry_request("GET", action, body=body, [ 1365.203501] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.203501] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] ret = obj(*args, **kwargs) [ 1365.203501] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1365.203501] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self.do_request(method, action, body=body, [ 1365.203501] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.203501] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] ret = obj(*args, **kwargs) [ 1365.203501] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 
1365.203501] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self._handle_fault_response(status_code, replybody, resp) [ 1365.203501] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1365.203501] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise exception.Unauthorized() [ 1365.203501] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] nova.exception.Unauthorized: Not authorized. [ 1365.203846] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.256276] env[67964]: INFO nova.scheduler.client.report [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Deleted allocations for instance c648c89a-ca70-4a15-9083-0cbe9e5bee23 [ 1365.275072] env[67964]: DEBUG oslo_concurrency.lockutils [None req-313a43d8-13f1-4c9b-931f-5106eac4e91e tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 582.204s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.276151] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 385.247s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1365.276427] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Acquiring lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1365.276639] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1365.276808] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.278751] env[67964]: INFO nova.compute.manager [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 
tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Terminating instance [ 1365.280395] env[67964]: DEBUG nova.compute.manager [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1365.280589] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1365.281083] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-27baaf50-1e31-4eba-a59c-97ec96071924 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.291008] env[67964]: DEBUG nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1365.297445] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-479f1542-07ed-476a-95a7-2e23b376c1a6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.332507] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c648c89a-ca70-4a15-9083-0cbe9e5bee23 could not be found. [ 1365.332712] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1365.332997] env[67964]: INFO nova.compute.manager [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1365.333158] env[67964]: DEBUG oslo.service.loopingcall [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1365.335344] env[67964]: DEBUG nova.compute.manager [-] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1365.335413] env[67964]: DEBUG nova.network.neutron [-] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1365.348783] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1365.375514] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0e2103d-981e-4e4a-9d79-7ea26984677f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.383296] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21e770bb-b418-4695-9c63-6f41dd00d877 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.413263] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b753578f-9cb9-4389-85d3-7bfbf9e03040 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.419943] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b371f73-a5eb-4a99-ae47-01ac9371044c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.432585] env[67964]: DEBUG nova.compute.provider_tree [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1365.434030] env[67964]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67964) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1365.434264] env[67964]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1365.434925] env[67964]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-93e0e7fb-3b1f-4135-9630-26bb382ca288'] [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1365.434925] env[67964]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1365.435427] env[67964]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1365.435427] env[67964]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1365.435900] env[67964]: ERROR oslo.service.loopingcall [ 1365.436340] env[67964]: ERROR nova.compute.manager [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
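Every neutron call in these tracebacks passes through the wrapper at nova/network/neutron.py:196, which translates neutronclient's Unauthorized according to which token was rejected: a rejected user token becomes nova.exception.Unauthorized (the line-204 raise in the earlier traceback), while a rejected admin token becomes NeutronAdminCredentialConfigurationInvalid (the line-212 raise here), meaning the [neutron] credentials in nova.conf themselves are bad. A hedged sketch of the idea, simplified and not the exact nova source:

    import functools

    class Unauthorized(Exception):
        """Raised when the user token is rejected (neutron.py:204)."""

    class NeutronAdminCredentialConfigurationInvalid(Exception):
        """Raised when the admin token is rejected (neutron.py:212)."""

    class ClientUnauthorized(Exception):
        """Stand-in for neutronclient.common.exceptions.Unauthorized."""

    def wrap_neutron_call(func, is_admin):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                # The recurring "ret = obj(*args, **kwargs)" frame in the log.
                return func(*args, **kwargs)
            except ClientUnauthorized:
                if is_admin:
                    # A 401 on the admin token is a configuration problem
                    # that no retry can fix.
                    raise NeutronAdminCredentialConfigurationInvalid()
                raise Unauthorized()
        return wrapper

This is also consistent with the "Dynamic interval looping call ... failed" record above giving up after a single pass: oslo.service's RetryDecorator retries only the exception types it was configured with, and a credential-configuration error is presumably not among them.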
[ 1365.442343] env[67964]: DEBUG nova.scheduler.client.report [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1365.455017] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.384s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.455483] env[67964]: DEBUG nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1365.457698] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.109s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1365.459195] env[67964]: INFO nova.compute.claims [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1365.476615] env[67964]: ERROR nova.compute.manager [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1365.476615] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Traceback (most recent call last): [ 1365.476615] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.476615] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] ret = obj(*args, **kwargs) [ 1365.476615] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1365.476615] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] exception_handler_v20(status_code, error_body) [ 1365.476615] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1365.476615] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise client_exc(message=error_message, [ 1365.476615] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1365.476615] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Neutron server returns request_ids: ['req-93e0e7fb-3b1f-4135-9630-26bb382ca288'] [ 1365.476615] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.477058] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] During handling of the above exception, another exception occurred: [ 1365.477058] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.477058] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Traceback (most recent call last): [ 1365.477058] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1365.477058] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self._delete_instance(context, instance, bdms) [ 1365.477058] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1365.477058] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self._shutdown_instance(context, instance, bdms) [ 1365.477058] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1365.477058] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self._try_deallocate_network(context, instance, requested_networks) [ 1365.477058] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1365.477058] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] with excutils.save_and_reraise_exception(): [ 1365.477058] env[67964]: ERROR 
nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1365.477058] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self.force_reraise() [ 1365.477406] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1365.477406] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise self.value [ 1365.477406] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1365.477406] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] _deallocate_network_with_retries() [ 1365.477406] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1365.477406] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return evt.wait() [ 1365.477406] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1365.477406] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] result = hub.switch() [ 1365.477406] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1365.477406] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self.greenlet.switch() [ 1365.477406] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1365.477406] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] result = func(*self.args, **self.kw) [ 1365.477730] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1365.477730] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] result = f(*args, **kwargs) [ 1365.477730] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1365.477730] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self._deallocate_network( [ 1365.477730] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1365.477730] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self.network_api.deallocate_for_instance( [ 1365.477730] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1365.477730] env[67964]: ERROR nova.compute.manager [instance: 
c648c89a-ca70-4a15-9083-0cbe9e5bee23] data = neutron.list_ports(**search_opts) [ 1365.477730] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.477730] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] ret = obj(*args, **kwargs) [ 1365.477730] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1365.477730] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self.list('ports', self.ports_path, retrieve_all, [ 1365.477730] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.478119] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] ret = obj(*args, **kwargs) [ 1365.478119] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1365.478119] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] for r in self._pagination(collection, path, **params): [ 1365.478119] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1365.478119] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] res = self.get(path, params=params) [ 1365.478119] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.478119] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] ret = obj(*args, **kwargs) [ 1365.478119] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1365.478119] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self.retry_request("GET", action, body=body, [ 1365.478119] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.478119] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] ret = obj(*args, **kwargs) [ 1365.478119] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1365.478119] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] return self.do_request(method, action, body=body, [ 1365.478451] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.478451] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] ret = obj(*args, **kwargs) [ 1365.478451] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1365.478451] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] self._handle_fault_response(status_code, replybody, resp) [ 1365.478451] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1365.478451] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1365.478451] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1365.478451] env[67964]: ERROR nova.compute.manager [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] [ 1365.505070] env[67964]: DEBUG nova.compute.utils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1365.506257] env[67964]: DEBUG nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1365.506428] env[67964]: DEBUG nova.network.neutron [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1365.510201] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.234s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.511861] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 205.793s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1365.512063] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1365.512379] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "c648c89a-ca70-4a15-9083-0cbe9e5bee23" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.516791] env[67964]: DEBUG nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1365.578065] env[67964]: DEBUG nova.policy [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b9fc77d3396842ed87ae657b8d6e1dbc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '67838ada47314689881a641ad7dcf20e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1365.580795] env[67964]: INFO nova.compute.manager [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] [instance: c648c89a-ca70-4a15-9083-0cbe9e5bee23] Successfully reverted task state from None on failure for instance. [ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server [None req-8641f36e-f507-4588-a528-32f8a73009b2 tempest-DeleteServersAdminTestJSON-788363875 tempest-DeleteServersAdminTestJSON-788363875-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-93e0e7fb-3b1f-4135-9630-26bb382ca288'] [ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server [ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server [ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1365.587437] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server raise self.value [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1365.587833] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server raise self.value [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server raise self.value [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3328, in terminate_instance [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1365.588258] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3323, in do_terminate_instance [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server raise self.value [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server raise self.value [ 1365.588649] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1365.589067] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.589480] env[67964]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1365.589480] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.589876] env[67964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1365.589876] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1365.589876] env[67964]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1365.589876] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1365.589876] env[67964]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1365.589876] env[67964]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1365.589876] env[67964]: ERROR oslo_messaging.rpc.server [ 1365.603061] env[67964]: DEBUG nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Start spawning the instance on the hypervisor. 
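Note on the traceback that ends here: the repeated excutils frames are oslo.utils' save_and_reraise_exception() context manager, which each layer of nova.compute.manager uses to run cleanup and then re-raise the original exception, so a single root cause surfaces through many identical frames. The root cause itself is raised by the wrapper in nova/network/neutron.py, which translates a neutronclient Unauthorized response on the admin client into NeutronAdminCredentialConfigurationInvalid. A minimal sketch of that translation pattern, with illustrative names (guard_neutron_call is not Nova's actual code):

    from neutronclient.common import exceptions as neutron_client_exc

    from nova import exception

    def guard_neutron_call(func, admin=True):
        """Proxy a neutronclient callable, translating auth failures."""
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)   # e.g. client.list_ports(...)
            except neutron_client_exc.Unauthorized:
                if admin:
                    # A 401 with static admin credentials means the [neutron]
                    # auth options in nova.conf are misconfigured, not that
                    # the user lacks rights.
                    raise exception.NeutronAdminCredentialConfigurationInvalid()
                raise exception.Forbidden()
        return wrapper
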
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1365.638057] env[67964]: DEBUG nova.virt.hardware [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1365.638377] env[67964]: DEBUG nova.virt.hardware [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1365.640795] env[67964]: DEBUG nova.virt.hardware [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1365.640795] env[67964]: DEBUG nova.virt.hardware [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1365.640795] env[67964]: DEBUG nova.virt.hardware [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1365.640795] env[67964]: DEBUG nova.virt.hardware [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1365.640795] env[67964]: DEBUG nova.virt.hardware [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1365.641067] env[67964]: DEBUG nova.virt.hardware [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1365.641067] env[67964]: DEBUG 
nova.virt.hardware [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1365.641067] env[67964]: DEBUG nova.virt.hardware [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1365.641067] env[67964]: DEBUG nova.virt.hardware [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1365.641215] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6093c21-05f8-44b0-b720-151410c0c3e0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.651821] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1e37af3-d81b-46d0-936c-12b1477058b4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.767485] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ef75c2a-97d4-4f47-8420-a674a7775ccd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.775137] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-486eb40c-45fb-4159-8a2e-6ab9a6d74df1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.804906] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99838683-535b-422d-9e1e-f0289f51b2f8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.812186] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7475b161-d536-44ae-b53f-d92e8c803138 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.825735] env[67964]: DEBUG nova.compute.provider_tree [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1365.835674] env[67964]: DEBUG nova.scheduler.client.report [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1365.850738] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.392s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.850738] env[67964]: DEBUG nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1365.885266] env[67964]: DEBUG nova.compute.utils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1365.889444] env[67964]: DEBUG nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1365.889444] env[67964]: DEBUG nova.network.neutron [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1365.896113] env[67964]: DEBUG nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1365.988789] env[67964]: DEBUG nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Start spawning the instance on the hypervisor. 
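The "compute_resources" lock messages around instance_claim above come from oslo.concurrency's synchronized decorator, whose wrapper (logged as "inner") reports how long each caller waited for and held the semaphore. A hedged sketch of the pattern, with an illustrative class name:

    from oslo_concurrency import lockutils

    COMPUTE_RESOURCE_SEMAPHORE = "compute_resources"

    class ResourceTrackerSketch(object):
        """Illustrative stand-in for the resource tracker's claim locking."""

        @lockutils.synchronized(COMPUTE_RESOURCE_SEMAPHORE)
        def instance_claim(self, context, instance, nodename, allocations):
            # Every claim on this host serializes on the named semaphore;
            # the "waited N s" / "held N s" lines are emitted by the
            # lockutils wrapper at the lockutils.py locations in the log.
            pass
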
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1366.016769] env[67964]: DEBUG nova.virt.hardware [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1366.017043] env[67964]: DEBUG nova.virt.hardware [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1366.017213] env[67964]: DEBUG nova.virt.hardware [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1366.017394] env[67964]: DEBUG nova.virt.hardware [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1366.017537] env[67964]: DEBUG nova.virt.hardware [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1366.017683] env[67964]: DEBUG nova.virt.hardware [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1366.017903] env[67964]: DEBUG nova.virt.hardware [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1366.018480] env[67964]: DEBUG nova.virt.hardware [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1366.018700] env[67964]: DEBUG 
nova.virt.hardware [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1366.018890] env[67964]: DEBUG nova.virt.hardware [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1366.019099] env[67964]: DEBUG nova.virt.hardware [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1366.019948] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-567695e7-c970-4da0-8505-bdd45acf5569 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.023212] env[67964]: DEBUG nova.network.neutron [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Successfully created port: 9c5c75c6-9099-4d95-91b7-a9c3aedc9d05 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1366.026011] env[67964]: DEBUG nova.policy [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cb0fcc8c390a4451a06d2ff90ef85253', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b28e13db1c6747e9b6c9fef34def6923', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1366.032881] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f27279e2-30c6-458a-8300-bc4b710b208b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.831942] env[67964]: DEBUG nova.network.neutron [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Successfully created port: cee7104a-b710-4cde-9e16-9d0de09d87e5 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1367.194387] env[67964]: DEBUG nova.compute.manager [req-e72f2f6d-7e14-41f3-8114-1321fe21c5b9 req-b9f3fbe8-bb93-46b6-9141-22fa5316b58e service nova] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Received event network-vif-plugged-9c5c75c6-9099-4d95-91b7-a9c3aedc9d05 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1367.194387] env[67964]: DEBUG oslo_concurrency.lockutils [req-e72f2f6d-7e14-41f3-8114-1321fe21c5b9 
req-b9f3fbe8-bb93-46b6-9141-22fa5316b58e service nova] Acquiring lock "ee34b117-806d-4cc4-98b7-0f40f074cfab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1367.194387] env[67964]: DEBUG oslo_concurrency.lockutils [req-e72f2f6d-7e14-41f3-8114-1321fe21c5b9 req-b9f3fbe8-bb93-46b6-9141-22fa5316b58e service nova] Lock "ee34b117-806d-4cc4-98b7-0f40f074cfab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1367.194387] env[67964]: DEBUG oslo_concurrency.lockutils [req-e72f2f6d-7e14-41f3-8114-1321fe21c5b9 req-b9f3fbe8-bb93-46b6-9141-22fa5316b58e service nova] Lock "ee34b117-806d-4cc4-98b7-0f40f074cfab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1367.194777] env[67964]: DEBUG nova.compute.manager [req-e72f2f6d-7e14-41f3-8114-1321fe21c5b9 req-b9f3fbe8-bb93-46b6-9141-22fa5316b58e service nova] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] No waiting events found dispatching network-vif-plugged-9c5c75c6-9099-4d95-91b7-a9c3aedc9d05 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1367.194777] env[67964]: WARNING nova.compute.manager [req-e72f2f6d-7e14-41f3-8114-1321fe21c5b9 req-b9f3fbe8-bb93-46b6-9141-22fa5316b58e service nova] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Received unexpected event network-vif-plugged-9c5c75c6-9099-4d95-91b7-a9c3aedc9d05 for instance with vm_state building and task_state spawning.
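The lock holder "InstanceEvents.pop_instance_event.<locals>._pop_event" is the qualified name of a function nested inside pop_instance_event, and "No waiting events found" means no greenthread had registered a waiter for this (event, port) pair before Neutron sent it, hence the WARNING. A simplified sketch of that bookkeeping, assuming illustrative structures rather than Nova's exact ones:

    from oslo_concurrency import lockutils

    class InstanceEventsSketch(object):
        """Illustrative per-instance external-event bookkeeping."""

        def __init__(self):
            self._events = {}   # instance uuid -> {(name, tag): waiter}

        def pop_instance_event(self, instance, event):
            @lockutils.synchronized(instance.uuid + '-events')
            def _pop_event():
                waiters = self._events.get(instance.uuid) or {}
                # Returns the registered waiter, or None when the event
                # arrived before anything started waiting for it.
                return waiters.pop((event.name, event.tag), None)
            # The nested function is why the lock holder is logged as
            # "...pop_instance_event.<locals>._pop_event".
            return _pop_event()
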
[ 1367.438525] env[67964]: DEBUG nova.network.neutron [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Successfully updated port: 9c5c75c6-9099-4d95-91b7-a9c3aedc9d05 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1367.451716] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "refresh_cache-ee34b117-806d-4cc4-98b7-0f40f074cfab" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1367.451873] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquired lock "refresh_cache-ee34b117-806d-4cc4-98b7-0f40f074cfab" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1367.452031] env[67964]: DEBUG nova.network.neutron [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1367.494920] env[67964]: DEBUG nova.network.neutron [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Instance cache missing network info. 
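The "refresh_cache-<uuid>" acquire/release pairs here guard the rebuild of the instance's network info cache, so the spawn path and event handlers never write partial cache entries concurrently. A minimal sketch of the locking pattern, assuming an illustrative helper (refresh_nw_info_cache is not Nova's real function):

    from oslo_concurrency import lockutils

    def refresh_nw_info_cache(network_api, context, instance):
        """Illustrative: rebuild an instance's network info under its lock."""
        with lockutils.lock('refresh_cache-%s' % instance.uuid):
            # Port data is re-read from Neutron and written back to the
            # instance_info_cache as one unit under the named lock.
            return network_api.get_instance_nw_info(context, instance)
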
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1367.858257] env[67964]: DEBUG nova.network.neutron [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Updating instance_info_cache with network_info: [{"id": "9c5c75c6-9099-4d95-91b7-a9c3aedc9d05", "address": "fa:16:3e:58:c5:98", "network": {"id": "8d0d0ce9-0998-4981-ab81-2a7595742174", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-353799566-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "67838ada47314689881a641ad7dcf20e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3739ba33-c119-432c-9aee-80a62864317d", "external-id": "nsx-vlan-transportzone-474", "segmentation_id": 474, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9c5c75c6-90", "ovs_interfaceid": "9c5c75c6-9099-4d95-91b7-a9c3aedc9d05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1367.871296] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Releasing lock "refresh_cache-ee34b117-806d-4cc4-98b7-0f40f074cfab" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1367.871525] env[67964]: DEBUG nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Instance network_info: |[{"id": "9c5c75c6-9099-4d95-91b7-a9c3aedc9d05", "address": "fa:16:3e:58:c5:98", "network": {"id": "8d0d0ce9-0998-4981-ab81-2a7595742174", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-353799566-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "67838ada47314689881a641ad7dcf20e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3739ba33-c119-432c-9aee-80a62864317d", "external-id": "nsx-vlan-transportzone-474", "segmentation_id": 474, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9c5c75c6-90", "ovs_interfaceid": "9c5c75c6-9099-4d95-91b7-a9c3aedc9d05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 1367.871903] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:58:c5:98', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3739ba33-c119-432c-9aee-80a62864317d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9c5c75c6-9099-4d95-91b7-a9c3aedc9d05', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1367.879461] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Creating folder: Project (67838ada47314689881a641ad7dcf20e). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1367.879983] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6a1bd9e4-fe2f-42e3-904d-1bcca3b63f89 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1367.891088] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Created folder: Project (67838ada47314689881a641ad7dcf20e) in parent group-v690366. [ 1367.891292] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Creating folder: Instances. Parent ref: group-v690448. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1367.891512] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1b710a56-b29b-4b40-b313-391e78ceb35a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1367.900218] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Created folder: Instances in parent group-v690448. [ 1367.900428] env[67964]: DEBUG oslo.service.loopingcall [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
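The "Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return." line is oslo.service's looping-call machinery: the work runs in a greenthread and the caller blocks on an event until the loop signals completion. A small, runnable sketch of that general pattern (the _poll function and its state dict are stand-ins):

    from oslo_service import loopingcall

    def _poll(state):
        # Stand-in for polling a long-running operation.
        state['ticks'] += 1
        if state['ticks'] >= 3:                  # pretend the work finished
            raise loopingcall.LoopingCallDone(retvalue='created')

    state = {'ticks': 0}
    timer = loopingcall.FixedIntervalLoopingCall(_poll, state)
    # start() returns an event; wait() blocks until LoopingCallDone and
    # yields its retvalue, which is the point where the "Waiting for
    # function ... to return" debug line is emitted.
    result = timer.start(interval=0.1).wait()
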
{{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1367.900605] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1367.900798] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ba8444c8-4973-45f5-8b34-b3c8ded7456c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1367.919134] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1367.919134] env[67964]: value = "task-3456821" [ 1367.919134] env[67964]: _type = "Task" [ 1367.919134] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1367.926448] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456821, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1368.033701] env[67964]: DEBUG nova.network.neutron [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Successfully updated port: cee7104a-b710-4cde-9e16-9d0de09d87e5 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1368.043578] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "refresh_cache-7825ba9e-8603-4211-b5fe-708276272464" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1368.043724] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquired lock "refresh_cache-7825ba9e-8603-4211-b5fe-708276272464" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1368.043862] env[67964]: DEBUG nova.network.neutron [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1368.095531] env[67964]: DEBUG nova.network.neutron [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Instance cache missing network info. 
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1368.304104] env[67964]: DEBUG nova.network.neutron [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Updating instance_info_cache with network_info: [{"id": "cee7104a-b710-4cde-9e16-9d0de09d87e5", "address": "fa:16:3e:dc:16:30", "network": {"id": "ffcd87e1-0022-450d-8ac9-578aa689bbc3", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-130940161-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b28e13db1c6747e9b6c9fef34def6923", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "489b2441-7132-4942-8b61-49cf0ad4400e", "external-id": "nsx-vlan-transportzone-971", "segmentation_id": 971, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcee7104a-b7", "ovs_interfaceid": "cee7104a-b710-4cde-9e16-9d0de09d87e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1368.314653] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Releasing lock "refresh_cache-7825ba9e-8603-4211-b5fe-708276272464" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1368.314946] env[67964]: DEBUG nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Instance network_info: |[{"id": "cee7104a-b710-4cde-9e16-9d0de09d87e5", "address": "fa:16:3e:dc:16:30", "network": {"id": "ffcd87e1-0022-450d-8ac9-578aa689bbc3", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-130940161-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b28e13db1c6747e9b6c9fef34def6923", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "489b2441-7132-4942-8b61-49cf0ad4400e", "external-id": "nsx-vlan-transportzone-971", "segmentation_id": 971, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcee7104a-b7", "ovs_interfaceid": "cee7104a-b710-4cde-9e16-9d0de09d87e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 1368.315338] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:dc:16:30', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '489b2441-7132-4942-8b61-49cf0ad4400e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cee7104a-b710-4cde-9e16-9d0de09d87e5', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1368.322662] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Creating folder: Project (b28e13db1c6747e9b6c9fef34def6923). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1368.323182] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-33555a37-0445-41c5-aa3d-1b2f9ea76cbb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.333514] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Created folder: Project (b28e13db1c6747e9b6c9fef34def6923) in parent group-v690366. [ 1368.333699] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Creating folder: Instances. Parent ref: group-v690451. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1368.333926] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3088be06-62c5-4d83-9191-170f34f45d7e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.342689] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Created folder: Instances in parent group-v690451. [ 1368.342907] env[67964]: DEBUG oslo.service.loopingcall [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1368.343100] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1368.343291] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-dc46501e-50b8-4341-b339-497ef49949dc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.362528] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1368.362528] env[67964]: value = "task-3456824" [ 1368.362528] env[67964]: _type = "Task" [ 1368.362528] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1368.369745] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456824, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1368.428690] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456821, 'name': CreateVM_Task, 'duration_secs': 0.264077} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1368.429041] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1368.429597] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1368.429793] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1368.430149] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1368.430545] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8ee7a822-40b5-4e83-91f7-aabf8f6a7900 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.434957] env[67964]: DEBUG oslo_vmware.api [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Waiting for the task: (returnval){ [ 1368.434957] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52915094-24bf-b81a-42e2-c546fa62c7cc" [ 1368.434957] env[67964]: _type = "Task" [ 1368.434957] env[67964]: } to complete. 
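The "(returnval){ value = "task-3456821" ... }" block is a vCenter Task managed-object reference; oslo.vmware then polls its TaskInfo, producing the "progress is 0%" lines, until the task succeeds or fails. A hedged sketch of that invoke-and-wait pattern, assuming an existing VMwareAPISession and pre-built folder/config/pool references:

    def create_vm_and_wait(session, folder_ref, config_spec, respool_ref):
        """Illustrative: invoke CreateVM_Task and poll it to completion."""
        task_ref = session.invoke_api(session.vim, 'CreateVM_Task',
                                      folder_ref, config=config_spec,
                                      pool=respool_ref)
        # wait_for_task() polls TaskInfo (the "progress is 0%" lines) and
        # returns the final task info on success, raising on error states.
        return session.wait_for_task(task_ref)
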
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1368.442342] env[67964]: DEBUG oslo_vmware.api [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52915094-24bf-b81a-42e2-c546fa62c7cc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1368.872877] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456824, 'name': CreateVM_Task, 'duration_secs': 0.290771} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1368.873124] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1368.873711] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1368.945601] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1368.945857] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1368.946096] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1368.946365] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1368.946703] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1368.946961] env[67964]: DEBUG 
oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a29b0633-68e5-4578-8112-86cec2a023d1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.951585] env[67964]: DEBUG oslo_vmware.api [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Waiting for the task: (returnval){ [ 1368.951585] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5267518f-b547-bc6f-84f8-df8bf3a2bc0c" [ 1368.951585] env[67964]: _type = "Task" [ 1368.951585] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1368.958725] env[67964]: DEBUG oslo_vmware.api [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5267518f-b547-bc6f-84f8-df8bf3a2bc0c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1369.223351] env[67964]: DEBUG nova.compute.manager [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Received event network-changed-9c5c75c6-9099-4d95-91b7-a9c3aedc9d05 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1369.223529] env[67964]: DEBUG nova.compute.manager [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Refreshing instance network info cache due to event network-changed-9c5c75c6-9099-4d95-91b7-a9c3aedc9d05. 
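This is the other half of the external-event flow: a "network-changed" event does not wake a waiter but instead triggers a refresh of the cached port data. A simplified dispatch sketch under that reading (the names and the refresh_vif_id keyword illustrate the pattern and are not guaranteed signatures):

    def handle_external_event(manager, context, instance, event):
        """Illustrative dispatch for Neutron-originated instance events."""
        if event.name == 'network-vif-plugged':
            waiter = manager.instance_events.pop_instance_event(instance,
                                                                event)
            if waiter:
                waiter.send(event)   # wake a thread blocked during spawn
        elif event.name == 'network-changed':
            # Re-read the port named by event.tag and update the cache,
            # which produces the "Refreshing network info cache" lines.
            manager.network_api.get_instance_nw_info(
                context, instance, refresh_vif_id=event.tag)
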
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1369.223815] env[67964]: DEBUG oslo_concurrency.lockutils [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] Acquiring lock "refresh_cache-ee34b117-806d-4cc4-98b7-0f40f074cfab" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1369.223815] env[67964]: DEBUG oslo_concurrency.lockutils [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] Acquired lock "refresh_cache-ee34b117-806d-4cc4-98b7-0f40f074cfab" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1369.224058] env[67964]: DEBUG nova.network.neutron [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Refreshing network info cache for port 9c5c75c6-9099-4d95-91b7-a9c3aedc9d05 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1369.461847] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1369.462122] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1369.462345] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1369.782125] env[67964]: DEBUG nova.network.neutron [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Updated VIF entry in instance network info cache for port 9c5c75c6-9099-4d95-91b7-a9c3aedc9d05. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1369.782125] env[67964]: DEBUG nova.network.neutron [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Updating instance_info_cache with network_info: [{"id": "9c5c75c6-9099-4d95-91b7-a9c3aedc9d05", "address": "fa:16:3e:58:c5:98", "network": {"id": "8d0d0ce9-0998-4981-ab81-2a7595742174", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-353799566-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "67838ada47314689881a641ad7dcf20e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3739ba33-c119-432c-9aee-80a62864317d", "external-id": "nsx-vlan-transportzone-474", "segmentation_id": 474, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9c5c75c6-90", "ovs_interfaceid": "9c5c75c6-9099-4d95-91b7-a9c3aedc9d05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1369.791208] env[67964]: DEBUG oslo_concurrency.lockutils [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] Releasing lock "refresh_cache-ee34b117-806d-4cc4-98b7-0f40f074cfab" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1369.791272] env[67964]: DEBUG nova.compute.manager [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Received event network-vif-plugged-cee7104a-b710-4cde-9e16-9d0de09d87e5 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1369.791463] env[67964]: DEBUG oslo_concurrency.lockutils [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] Acquiring lock "7825ba9e-8603-4211-b5fe-708276272464-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1369.791659] env[67964]: DEBUG oslo_concurrency.lockutils [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] Lock "7825ba9e-8603-4211-b5fe-708276272464-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1369.791815] env[67964]: DEBUG oslo_concurrency.lockutils [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] Lock "7825ba9e-8603-4211-b5fe-708276272464-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1369.791974] env[67964]: DEBUG
nova.compute.manager [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] [instance: 7825ba9e-8603-4211-b5fe-708276272464] No waiting events found dispatching network-vif-plugged-cee7104a-b710-4cde-9e16-9d0de09d87e5 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1369.792152] env[67964]: WARNING nova.compute.manager [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Received unexpected event network-vif-plugged-cee7104a-b710-4cde-9e16-9d0de09d87e5 for instance with vm_state building and task_state spawning. [ 1369.792313] env[67964]: DEBUG nova.compute.manager [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Received event network-changed-cee7104a-b710-4cde-9e16-9d0de09d87e5 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1369.792462] env[67964]: DEBUG nova.compute.manager [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Refreshing instance network info cache due to event network-changed-cee7104a-b710-4cde-9e16-9d0de09d87e5. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1369.792638] env[67964]: DEBUG oslo_concurrency.lockutils [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] Acquiring lock "refresh_cache-7825ba9e-8603-4211-b5fe-708276272464" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1369.792773] env[67964]: DEBUG oslo_concurrency.lockutils [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] Acquired lock "refresh_cache-7825ba9e-8603-4211-b5fe-708276272464" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1369.792921] env[67964]: DEBUG nova.network.neutron [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Refreshing network info cache for port cee7104a-b710-4cde-9e16-9d0de09d87e5 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1370.081807] env[67964]: DEBUG nova.network.neutron [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Updated VIF entry in instance network info cache for port cee7104a-b710-4cde-9e16-9d0de09d87e5. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1370.082217] env[67964]: DEBUG nova.network.neutron [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Updating instance_info_cache with network_info: [{"id": "cee7104a-b710-4cde-9e16-9d0de09d87e5", "address": "fa:16:3e:dc:16:30", "network": {"id": "ffcd87e1-0022-450d-8ac9-578aa689bbc3", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-130940161-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b28e13db1c6747e9b6c9fef34def6923", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "489b2441-7132-4942-8b61-49cf0ad4400e", "external-id": "nsx-vlan-transportzone-971", "segmentation_id": 971, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcee7104a-b7", "ovs_interfaceid": "cee7104a-b710-4cde-9e16-9d0de09d87e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1370.091557] env[67964]: DEBUG oslo_concurrency.lockutils [req-7451a0ef-ed31-4f56-8563-79291f9fe347 req-2ef698f1-2453-409b-9263-ccc0526f1a91 service nova] Releasing lock "refresh_cache-7825ba9e-8603-4211-b5fe-708276272464" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1380.121068] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquiring lock "ea5f3d40-6494-459a-a917-2602d0718d8c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1380.121455] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Lock "ea5f3d40-6494-459a-a917-2602d0718d8c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1383.465500] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1383.465939] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... 
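
Each element of the network_info list in the cache-update entries above is a plain VIF dict. A short sketch, with data abbreviated from the cee7104a entry, showing how the commonly inspected fields nest:

# Abbreviated copy of the update_instance_cache_with_nw_info entry above.
vif = {
    "id": "cee7104a-b710-4cde-9e16-9d0de09d87e5",
    "address": "fa:16:3e:dc:16:30",
    "devname": "tapcee7104a-b7",
    "type": "ovs",
    "details": {"segmentation_id": 971,
                "nsx-logical-switch-id": "489b2441-7132-4942-8b61-49cf0ad4400e"},
    "network": {"label": "tempest-AttachVolumeNegativeTest-130940161-network",
                "subnets": [{"cidr": "192.168.128.0/28",
                             "ips": [{"address": "192.168.128.14", "type": "fixed"}]}]},
}

fixed_ips = [ip["address"]
             for subnet in vif["network"]["subnets"]
             for ip in subnet["ips"]]
print(vif["devname"], vif["address"], fixed_ips)
# tapcee7104a-b7 fa:16:3e:dc:16:30 ['192.168.128.14']
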
{{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1383.466352] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1383.477157] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1383.477359] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1383.477512] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1383.477659] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1383.478731] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51206e02-1e83-4ab5-b6d9-f4f34530aba2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.487237] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06ec8a17-354e-4356-b93e-4ceea551abc2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.500334] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-615af75b-a130-46f9-9b84-5c120756385a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.506445] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d8a2923-df2c-4cb5-a8ba-18f44bc2310f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.536255] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180926MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1383.536255] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1383.536255] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1383.608505] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9793d383-9033-4f86-b7bb-6b2e43347cd6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.609085] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 5fbee4c3-bc7c-4582-b976-b0d619a69cdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.609085] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 67eb58c3-a895-4427-9197-3b0c731a123a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.609085] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.609085] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d9dcb5d4-e8a3-4d4d-af94-1bde87121c08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.609263] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9cd7ef82-147a-4303-a773-32b161f819ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.609263] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18c148fb-1cd4-4537-9b77-089e9b272f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.609382] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18d6df82-a19a-499a-8874-171218569651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.609489] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ee34b117-806d-4cc4-98b7-0f40f074cfab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.609604] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7825ba9e-8603-4211-b5fe-708276272464 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.623209] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance cc4cdc79-2620-42c6-bf3d-0b108a2cbfe0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1383.636305] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance fb025130-d995-4615-8dee-59af1700877f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1383.648287] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 60cd7925-3124-449e-8d27-4faa7b27cb9c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1383.658212] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec783231-6f62-4177-ba76-4ba688dda077 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1383.688482] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0f126555-f26e-42da-a468-28a28887c901 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1383.701867] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 0228456f-0055-43b9-9a81-e0f031e2a549 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1383.712020] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance a0908e14-521d-42c1-baaa-b5863b1f142d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1383.721774] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d64969c7-d467-4958-8b04-aa2d2920769a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1383.731151] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea5f3d40-6494-459a-a917-2602d0718d8c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
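
The audit is easy to cross-check against the "Final resource view" entry that follows: ten instances are actively managed, each holding {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, the MEMORY_MB inventory reported further down reserves 512 MB, and the nine instances still waiting to start are skipped by the allocation-heal pass and do not contribute to these totals:

active = 10                                   # instances listed as actively managed above
per = {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}
reserved_mb = 512                             # 'reserved' in the MEMORY_MB inventory below

print(reserved_mb + active * per['MEMORY_MB'])  # 1792 -> used_ram=1792MB
print(active * per['DISK_GB'])                  # 10   -> used_disk=10GB
print(active * per['VCPU'])                     # 10   -> used_vcpus=10
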
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1383.731375] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1383.731514] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1383.927192] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92056f55-2772-4bba-8e52-d360a193b69c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.934950] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a897c844-cfc9-4b8e-8733-41d52ade885c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.963807] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eec24e81-d98f-437d-b6e5-095df92ba26b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.971088] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b529ae2-cca1-413f-9534-a553ed7916b3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.983997] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1383.996190] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1384.009368] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1384.009557] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.475s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.344794] env[67964]: DEBUG oslo_service.periodic_task [None 
req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1384.800311] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1386.402451] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6aceae69-5f34-4682-bf51-1f1ab0c540d9 tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Acquiring lock "18d6df82-a19a-499a-8874-171218569651" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1386.800824] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1386.800824] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1388.795791] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1389.801044] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1390.800663] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1390.800865] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1390.800987] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1390.823217] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1390.823539] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Skipping network cache update for instance because it is Building. 
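
The "Running periodic task ComputeManager._*" lines come from oslo.service's periodic-task runner; the declaration pattern, reduced to a sketch (class name and spacing are illustrative, not nova's actual values):

from oslo_config import cfg
from oslo_service import periodic_task

class Manager(periodic_task.PeriodicTasks):
    @periodic_task.periodic_task(spacing=60)      # run at most once per 60 s
    def _poll_something(self, context):
        # Real tasks: _poll_volume_usage, _heal_instance_info_cache, ...
        pass

mgr = Manager(cfg.CONF)
mgr.run_periodic_tasks(context=None)   # logs "Running periodic task ..." for each due task
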
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1390.823539] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1390.823678] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1390.823798] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1390.823927] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1390.824042] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1390.824163] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18d6df82-a19a-499a-8874-171218569651] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1390.824280] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1390.824394] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1390.824510] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1392.580563] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8b6d5c0b-57b8-4b9f-b206-442a0c933938 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "ee34b117-806d-4cc4-98b7-0f40f074cfab" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1394.137507] env[67964]: DEBUG oslo_concurrency.lockutils [None req-07a1b448-51df-494f-bb5c-9c707c4e0e04 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "7825ba9e-8603-4211-b5fe-708276272464" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1398.489810] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "da8f11e2-6d58-4e28-aabb-9943bc657e60" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1398.489810] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "da8f11e2-6d58-4e28-aabb-9943bc657e60" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1399.819469] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1402.102376] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1402.102677] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1410.417047] env[67964]: WARNING oslo_vmware.rw_handles [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1410.417047] env[67964]: 
ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1410.417047] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1410.417047] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1410.417047] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1410.417047] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1410.417047] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1410.417047] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1410.417047] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1410.417047] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1410.417047] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1410.417047] env[67964]: ERROR oslo_vmware.rw_handles [ 1410.417712] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/cee42da6-1aea-445c-a566-d6b26e9b370a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1410.419753] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1410.420046] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Copying Virtual Disk [datastore1] vmware_temp/cee42da6-1aea-445c-a566-d6b26e9b370a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/cee42da6-1aea-445c-a566-d6b26e9b370a/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1410.420327] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-28d58d7b-0644-4309-876a-f74f7c55e216 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1410.427847] env[67964]: DEBUG oslo_vmware.api [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for the task: (returnval){ [ 1410.427847] env[67964]: value = "task-3456825" [ 1410.427847] env[67964]: _type = "Task" [ 1410.427847] env[67964]: } to complete. 
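
The "Waiting for the task: (returnval){ ... } to complete" block is oslo.vmware's task-wait pattern: any *_Task API method returns a Task moref, and session.wait_for_task() polls it. Roughly, with placeholder credentials and most optional CopyVirtualDisk_Task arguments omitted:

from oslo_vmware import api, exceptions

session = api.VMwareAPISession('vc.example.test', 'user', 'secret',   # placeholders
                               api_retry_count=3, task_poll_interval=0.5)

disk_mgr = session.vim.service_content.virtualDiskManager
task = session.invoke_api(session.vim, 'CopyVirtualDisk_Task', disk_mgr,
                          sourceName='[datastore1] vmware_temp/tmp-sparse.vmdk',
                          destName='[datastore1] vmware_temp/image.vmdk')
try:
    session.wait_for_task(task)        # blocks; logs the _poll_task progress lines
except exceptions.VimFaultException as e:
    print(e.fault_list)                # here: ['InvalidArgument'] ("fileType")
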
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1410.436419] env[67964]: DEBUG oslo_vmware.api [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': task-3456825, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1410.938312] env[67964]: DEBUG oslo_vmware.exceptions [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1410.938619] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1410.939195] env[67964]: ERROR nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1410.939195] env[67964]: Faults: ['InvalidArgument'] [ 1410.939195] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Traceback (most recent call last): [ 1410.939195] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1410.939195] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] yield resources [ 1410.939195] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1410.939195] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] self.driver.spawn(context, instance, image_meta, [ 1410.939195] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1410.939195] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1410.939195] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1410.939195] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] self._fetch_image_if_missing(context, vi) [ 1410.939195] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1410.939549] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] image_cache(vi, tmp_image_ds_loc) [ 1410.939549] env[67964]: ERROR nova.compute.manager [instance: 
9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1410.939549] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] vm_util.copy_virtual_disk( [ 1410.939549] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1410.939549] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] session._wait_for_task(vmdk_copy_task) [ 1410.939549] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1410.939549] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] return self.wait_for_task(task_ref) [ 1410.939549] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1410.939549] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] return evt.wait() [ 1410.939549] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1410.939549] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] result = hub.switch() [ 1410.939549] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1410.939549] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] return self.greenlet.switch() [ 1410.939888] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1410.939888] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] self.f(*self.args, **self.kw) [ 1410.939888] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1410.939888] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] raise exceptions.translate_fault(task_info.error) [ 1410.939888] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1410.939888] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Faults: ['InvalidArgument'] [ 1410.939888] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] [ 1410.939888] env[67964]: INFO nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Terminating instance [ 1410.941049] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1410.941255] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1410.941855] env[67964]: DEBUG nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1410.942119] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1410.942354] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-af5c3d75-8a0b-4123-829b-27e485e291a3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1410.945170] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-574e1b15-991b-4abe-9b1e-d753f472b90f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1410.953287] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1410.954257] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7ec3faf0-362b-404c-bc7d-625b21c38466 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1410.955610] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1410.955781] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Folder [datastore1] devstack-image-cache_base created. 
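
The Creating/Created directory pair above is a single FileManager.MakeDirectory call, made idempotent by createParentDirectories. With the session from the sketch above (dc_ref stands in for a Datacenter moref that nova looks up elsewhere):

dc_ref = None   # Datacenter moref; None only when talking to an ESX host directly
session.invoke_api(session.vim, 'MakeDirectory',
                   session.vim.service_content.fileManager,
                   name='[datastore1] devstack-image-cache_base',
                   datacenter=dc_ref,
                   createParentDirectories=True)
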
{{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1410.956459] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1ca3ddc6-599a-4180-80b5-8054e8d0edb4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1410.961364] env[67964]: DEBUG oslo_vmware.api [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Waiting for the task: (returnval){ [ 1410.961364] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5296ba31-e7f8-9d02-8db6-1718f65b8ee1" [ 1410.961364] env[67964]: _type = "Task" [ 1410.961364] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1410.968458] env[67964]: DEBUG oslo_vmware.api [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5296ba31-e7f8-9d02-8db6-1718f65b8ee1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1411.019719] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1411.019910] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1411.020060] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Deleting the datastore file [datastore1] 9793d383-9033-4f86-b7bb-6b2e43347cd6 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1411.020326] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b757a768-ee5b-4895-b2b5-c0521fba8912 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.027155] env[67964]: DEBUG oslo_vmware.api [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for the task: (returnval){ [ 1411.027155] env[67964]: value = "task-3456827" [ 1411.027155] env[67964]: _type = "Task" [ 1411.027155] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1411.034623] env[67964]: DEBUG oslo_vmware.api [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': task-3456827, 'name': DeleteDatastoreFile_Task} progress is 0%. 
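
Strings of the form "[datastore1] path/file" that appear throughout these entries are datastore paths; oslo.vmware ships a parser for them:

from oslo_vmware.objects.datastore import DatastorePath

p = DatastorePath.parse('[datastore1] 9793d383-9033-4f86-b7bb-6b2e43347cd6')
print(p.datastore)   # datastore1
print(p.rel_path)    # 9793d383-9033-4f86-b7bb-6b2e43347cd6
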
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1411.471315] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1411.471619] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Creating directory with path [datastore1] vmware_temp/1afe579a-f0ff-4296-bcec-9ab3d43269e5/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1411.471796] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b30e7325-83f5-448f-89ee-603457b55f4f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.483254] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Created directory with path [datastore1] vmware_temp/1afe579a-f0ff-4296-bcec-9ab3d43269e5/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1411.483444] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Fetch image to [datastore1] vmware_temp/1afe579a-f0ff-4296-bcec-9ab3d43269e5/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1411.483640] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/1afe579a-f0ff-4296-bcec-9ab3d43269e5/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1411.484374] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58e1a568-4143-49af-be2b-6435cd8d8e46 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.491043] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fe3bbf5-71f2-4140-adf4-76119b666564 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.500080] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc689395-2905-4824-95bb-6bfe2bb1b52f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.533613] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c31ec53-07d8-4d01-b416-f985e613eb6a {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.540564] env[67964]: DEBUG oslo_vmware.api [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': task-3456827, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075538} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1411.542038] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1411.542223] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1411.542385] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1411.542549] env[67964]: INFO nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Took 0.60 seconds to destroy the instance on the hypervisor. 
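
The upload that follows goes through the ESX host's /folder HTTP endpoint; the datacenter and datastore ride along as query parameters, which the standard library makes easy to see:

from urllib.parse import urlsplit, parse_qs

url = ('https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/'
       'vmware_temp/1afe579a-f0ff-4296-bcec-9ab3d43269e5/'
       'b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk'
       '?dcPath=ha-datacenter&dsName=datastore1')
parts = urlsplit(url)
print(parts.path)             # /folder/<path relative to the datastore root>
print(parse_qs(parts.query))  # {'dcPath': ['ha-datacenter'], 'dsName': ['datastore1']}
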
[ 1411.544280] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-58eb967a-1d39-45ad-b122-cb821146b9bf {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.546277] env[67964]: DEBUG nova.compute.claims [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1411.546357] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1411.546574] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1411.566515] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1411.620721] env[67964]: DEBUG oslo_vmware.rw_handles [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1afe579a-f0ff-4296-bcec-9ab3d43269e5/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1411.679998] env[67964]: DEBUG oslo_vmware.rw_handles [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1411.679998] env[67964]: DEBUG oslo_vmware.rw_handles [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1afe579a-f0ff-4296-bcec-9ab3d43269e5/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
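
This close succeeded, but the WARNING earlier in this section shows the same close raising http.client.RemoteDisconnected when the ESX side hangs up without sending a final response. The failure mode reproduces with the standard library alone:

import http.client
import socket
import threading

# One-shot server that accepts a connection and closes it without
# writing any HTTP response -- what the ESX host did in the earlier traceback.
srv = socket.socket()
srv.bind(('127.0.0.1', 0))
srv.listen(1)

def hang_up():
    conn, _ = srv.accept()
    conn.recv(1024)
    conn.close()

threading.Thread(target=hang_up).start()

client = http.client.HTTPConnection('127.0.0.1', srv.getsockname()[1])
client.request('PUT', '/folder/tmp-sparse.vmdk')
try:
    client.getresponse()
except http.client.RemoteDisconnected as err:
    # oslo.vmware logs this as a WARNING and carries on: the payload may
    # already have been written in full, as it was in the earlier traceback.
    print('remote end hung up:', err)
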
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1411.848485] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea835642-ada1-46c1-a857-d97f2246161f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.855433] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-692008a8-c3a8-4021-8f28-85e641533086 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.883931] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a84f83b-0fa2-41f1-a4f5-1bca7d4df962 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.891149] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ba1d581-d3ee-40b8-a7de-bebc5b0e69e6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.904097] env[67964]: DEBUG nova.compute.provider_tree [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1411.914572] env[67964]: DEBUG nova.scheduler.client.report [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1411.929577] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.383s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1411.930162] env[67964]: ERROR nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1411.930162] env[67964]: Faults: ['InvalidArgument'] [ 1411.930162] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Traceback (most recent call last): [ 1411.930162] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1411.930162] 
env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] self.driver.spawn(context, instance, image_meta, [ 1411.930162] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1411.930162] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1411.930162] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1411.930162] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] self._fetch_image_if_missing(context, vi) [ 1411.930162] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1411.930162] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] image_cache(vi, tmp_image_ds_loc) [ 1411.930162] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1411.930493] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] vm_util.copy_virtual_disk( [ 1411.930493] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1411.930493] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] session._wait_for_task(vmdk_copy_task) [ 1411.930493] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1411.930493] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] return self.wait_for_task(task_ref) [ 1411.930493] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1411.930493] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] return evt.wait() [ 1411.930493] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1411.930493] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] result = hub.switch() [ 1411.930493] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1411.930493] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] return self.greenlet.switch() [ 1411.930493] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1411.930493] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] self.f(*self.args, **self.kw) [ 1411.930814] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1411.930814] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] raise exceptions.translate_fault(task_info.error) [ 1411.930814] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1411.930814] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Faults: ['InvalidArgument'] [ 1411.930814] env[67964]: ERROR nova.compute.manager [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] [ 1411.930937] env[67964]: DEBUG nova.compute.utils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1411.932202] env[67964]: DEBUG nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Build of instance 9793d383-9033-4f86-b7bb-6b2e43347cd6 was re-scheduled: A specified parameter was not correct: fileType [ 1411.932202] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1411.932573] env[67964]: DEBUG nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1411.932747] env[67964]: DEBUG nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
[ 1411.932914] env[67964]: DEBUG nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1411.933094] env[67964]: DEBUG nova.network.neutron [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1412.246726] env[67964]: DEBUG nova.network.neutron [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1412.257031] env[67964]: INFO nova.compute.manager [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Took 0.32 seconds to deallocate network for instance. [ 1412.363762] env[67964]: INFO nova.scheduler.client.report [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Deleted allocations for instance 9793d383-9033-4f86-b7bb-6b2e43347cd6 [ 1412.383581] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a2d2f3f6-7b04-4318-8ea4-da8528bb4dfd tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "9793d383-9033-4f86-b7bb-6b2e43347cd6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 601.040s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1412.385228] env[67964]: DEBUG oslo_concurrency.lockutils [None req-13c83e84-246b-4204-9e46-2871744e0170 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "9793d383-9033-4f86-b7bb-6b2e43347cd6" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 404.757s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1412.385477] env[67964]: DEBUG oslo_concurrency.lockutils [None req-13c83e84-246b-4204-9e46-2871744e0170 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "9793d383-9033-4f86-b7bb-6b2e43347cd6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1412.385687] env[67964]: DEBUG oslo_concurrency.lockutils [None req-13c83e84-246b-4204-9e46-2871744e0170 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "9793d383-9033-4f86-b7bb-6b2e43347cd6-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1412.385855] env[67964]: DEBUG oslo_concurrency.lockutils [None req-13c83e84-246b-4204-9e46-2871744e0170 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "9793d383-9033-4f86-b7bb-6b2e43347cd6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1412.388042] env[67964]: INFO nova.compute.manager [None req-13c83e84-246b-4204-9e46-2871744e0170 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Terminating instance [ 1412.390427] env[67964]: DEBUG nova.compute.manager [None req-13c83e84-246b-4204-9e46-2871744e0170 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1412.390625] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-13c83e84-246b-4204-9e46-2871744e0170 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1412.390887] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-af2e54e6-69d6-4bf6-9882-595804419d31 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.400797] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a435877-61c4-4d7f-8b2d-90c4ef245894 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.412379] env[67964]: DEBUG nova.compute.manager [None req-530a2849-71f7-4865-a1c6-446fda5b7ea7 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: cc4cdc79-2620-42c6-bf3d-0b108a2cbfe0] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1412.433936] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-13c83e84-246b-4204-9e46-2871744e0170 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9793d383-9033-4f86-b7bb-6b2e43347cd6 could not be found. 
[ 1412.434506] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-13c83e84-246b-4204-9e46-2871744e0170 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1412.434506] env[67964]: INFO nova.compute.manager [None req-13c83e84-246b-4204-9e46-2871744e0170 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1412.434628] env[67964]: DEBUG oslo.service.loopingcall [None req-13c83e84-246b-4204-9e46-2871744e0170 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1412.434754] env[67964]: DEBUG nova.compute.manager [-] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1412.435222] env[67964]: DEBUG nova.network.neutron [-] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1412.438023] env[67964]: DEBUG nova.compute.manager [None req-530a2849-71f7-4865-a1c6-446fda5b7ea7 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: cc4cdc79-2620-42c6-bf3d-0b108a2cbfe0] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1412.460377] env[67964]: DEBUG oslo_concurrency.lockutils [None req-530a2849-71f7-4865-a1c6-446fda5b7ea7 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "cc4cdc79-2620-42c6-bf3d-0b108a2cbfe0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 211.234s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1412.465531] env[67964]: DEBUG nova.network.neutron [-] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1412.472705] env[67964]: DEBUG nova.compute.manager [None req-be1d3380-6677-4de9-a32e-a3485a81bc8d tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: fb025130-d995-4615-8dee-59af1700877f] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1412.475354] env[67964]: INFO nova.compute.manager [-] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] Took 0.04 seconds to deallocate network for instance. [ 1412.497396] env[67964]: DEBUG nova.compute.manager [None req-be1d3380-6677-4de9-a32e-a3485a81bc8d tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: fb025130-d995-4615-8dee-59af1700877f] Instance disappeared before build. 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1412.521402] env[67964]: DEBUG oslo_concurrency.lockutils [None req-be1d3380-6677-4de9-a32e-a3485a81bc8d tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "fb025130-d995-4615-8dee-59af1700877f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.024s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1412.533096] env[67964]: DEBUG nova.compute.manager [None req-5a3516b0-0f68-4acd-9a5d-02535f0c84bc tempest-ServersTestMultiNic-682379730 tempest-ServersTestMultiNic-682379730-project-member] [instance: 60cd7925-3124-449e-8d27-4faa7b27cb9c] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1412.558389] env[67964]: DEBUG nova.compute.manager [None req-5a3516b0-0f68-4acd-9a5d-02535f0c84bc tempest-ServersTestMultiNic-682379730 tempest-ServersTestMultiNic-682379730-project-member] [instance: 60cd7925-3124-449e-8d27-4faa7b27cb9c] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1412.585012] env[67964]: DEBUG oslo_concurrency.lockutils [None req-13c83e84-246b-4204-9e46-2871744e0170 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "9793d383-9033-4f86-b7bb-6b2e43347cd6" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.199s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1412.585785] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "9793d383-9033-4f86-b7bb-6b2e43347cd6" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 252.867s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1412.585964] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9793d383-9033-4f86-b7bb-6b2e43347cd6] During sync_power_state the instance has a pending task (deleting). Skip. [ 1412.586147] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "9793d383-9033-4f86-b7bb-6b2e43347cd6" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1412.598921] env[67964]: DEBUG oslo_concurrency.lockutils [None req-5a3516b0-0f68-4acd-9a5d-02535f0c84bc tempest-ServersTestMultiNic-682379730 tempest-ServersTestMultiNic-682379730-project-member] Lock "60cd7925-3124-449e-8d27-4faa7b27cb9c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.900s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1412.608299] env[67964]: DEBUG nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Starting instance... 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1412.656542] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1412.656813] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1412.658248] env[67964]: INFO nova.compute.claims [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1412.883723] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0312aea1-61c6-42db-80fa-85a7e2988e46 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.891672] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3a3c7c6-8cbb-44cb-857b-b4c175bf70a0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.921156] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09d52ff9-cfeb-436b-aa2a-f1ef67fb7bf4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.928340] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcf85546-b36b-4d5a-a03f-bb8c8159caa3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.941528] env[67964]: DEBUG nova.compute.provider_tree [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1412.950480] env[67964]: DEBUG nova.scheduler.client.report [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}}
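
The two "Inventory has not changed" entries above come down to a plain value comparison: the resource tracker rebuilds the provider's inventory dict and only calls placement when it differs from what was last reported. A simplified sketch of that check; inventory_delta is an illustrative helper, not the report client's actual code.

    def inventory_delta(reported, current):
        # Resource classes whose inventory fields differ; an empty set is the
        # "Inventory has not changed ... skipping update" fast path.
        return {rc for rc in set(reported) | set(current)
                if reported.get(rc) != current.get(rc)}

    reported = {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                         'step_size': 1, 'allocation_ratio': 4.0}}
    current = {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                        'step_size': 1, 'allocation_ratio': 4.0}}
    if not inventory_delta(reported, current):
        print("Inventory has not changed for provider; nothing to send to placement")
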
[ 1412.966292] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.309s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1412.966768] env[67964]: DEBUG nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1413.003200] env[67964]: DEBUG nova.compute.utils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1413.004503] env[67964]: DEBUG nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1413.004676] env[67964]: DEBUG nova.network.neutron [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1413.015110] env[67964]: DEBUG nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1413.068059] env[67964]: DEBUG nova.policy [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93750f928d0f49a4bb2ee941ac103453', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ab7190851127465491dd4808bbcc3e87', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1413.075124] env[67964]: DEBUG nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Start spawning the instance on the hypervisor.
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1413.099752] env[67964]: DEBUG nova.virt.hardware [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1413.099988] env[67964]: DEBUG nova.virt.hardware [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1413.100350] env[67964]: DEBUG nova.virt.hardware [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1413.100350] env[67964]: DEBUG nova.virt.hardware [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1413.100464] env[67964]: DEBUG nova.virt.hardware [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1413.100597] env[67964]: DEBUG nova.virt.hardware [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1413.100802] env[67964]: DEBUG nova.virt.hardware [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1413.100951] env[67964]: DEBUG nova.virt.hardware [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1413.101126] env[67964]: DEBUG nova.virt.hardware [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1413.101445] env[67964]: DEBUG nova.virt.hardware [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1413.101445] env[67964]: DEBUG nova.virt.hardware [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1413.102336] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8052d20-85c3-4e36-b2d2-da5d0510ed08 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1413.110553] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d57616bc-aad8-4b80-acf4-85901748671d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1413.445112] env[67964]: DEBUG nova.network.neutron [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Successfully created port: 3857960f-41e5-4204-80c1-c5945a4037cf {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1414.174359] env[67964]: DEBUG nova.network.neutron [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Successfully updated port: 3857960f-41e5-4204-80c1-c5945a4037cf {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1414.185341] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Acquiring lock "refresh_cache-ec783231-6f62-4177-ba76-4ba688dda077" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1414.185479] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Acquired lock "refresh_cache-ec783231-6f62-4177-ba76-4ba688dda077" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1414.185622] env[67964]: DEBUG nova.network.neutron [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: 
ec783231-6f62-4177-ba76-4ba688dda077] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1414.227643] env[67964]: DEBUG nova.network.neutron [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1414.284242] env[67964]: DEBUG nova.compute.manager [req-66e76351-1652-4a18-97e1-83c45030fe45 req-7686b15a-247c-4cb1-8923-f36220f0b076 service nova] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Received event network-vif-plugged-3857960f-41e5-4204-80c1-c5945a4037cf {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1414.284461] env[67964]: DEBUG oslo_concurrency.lockutils [req-66e76351-1652-4a18-97e1-83c45030fe45 req-7686b15a-247c-4cb1-8923-f36220f0b076 service nova] Acquiring lock "ec783231-6f62-4177-ba76-4ba688dda077-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1414.284949] env[67964]: DEBUG oslo_concurrency.lockutils [req-66e76351-1652-4a18-97e1-83c45030fe45 req-7686b15a-247c-4cb1-8923-f36220f0b076 service nova] Lock "ec783231-6f62-4177-ba76-4ba688dda077-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1414.285173] env[67964]: DEBUG oslo_concurrency.lockutils [req-66e76351-1652-4a18-97e1-83c45030fe45 req-7686b15a-247c-4cb1-8923-f36220f0b076 service nova] Lock "ec783231-6f62-4177-ba76-4ba688dda077-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1414.285349] env[67964]: DEBUG nova.compute.manager [req-66e76351-1652-4a18-97e1-83c45030fe45 req-7686b15a-247c-4cb1-8923-f36220f0b076 service nova] [instance: ec783231-6f62-4177-ba76-4ba688dda077] No waiting events found dispatching network-vif-plugged-3857960f-41e5-4204-80c1-c5945a4037cf {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1414.285512] env[67964]: WARNING nova.compute.manager [req-66e76351-1652-4a18-97e1-83c45030fe45 req-7686b15a-247c-4cb1-8923-f36220f0b076 service nova] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Received unexpected event network-vif-plugged-3857960f-41e5-4204-80c1-c5945a4037cf for instance with vm_state building and task_state spawning. [ 1414.285675] env[67964]: DEBUG nova.compute.manager [req-66e76351-1652-4a18-97e1-83c45030fe45 req-7686b15a-247c-4cb1-8923-f36220f0b076 service nova] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Received event network-changed-3857960f-41e5-4204-80c1-c5945a4037cf {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1414.285828] env[67964]: DEBUG nova.compute.manager [req-66e76351-1652-4a18-97e1-83c45030fe45 req-7686b15a-247c-4cb1-8923-f36220f0b076 service nova] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Refreshing instance network info cache due to event network-changed-3857960f-41e5-4204-80c1-c5945a4037cf. 
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1414.285996] env[67964]: DEBUG oslo_concurrency.lockutils [req-66e76351-1652-4a18-97e1-83c45030fe45 req-7686b15a-247c-4cb1-8923-f36220f0b076 service nova] Acquiring lock "refresh_cache-ec783231-6f62-4177-ba76-4ba688dda077" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1414.385195] env[67964]: DEBUG nova.network.neutron [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Updating instance_info_cache with network_info: [{"id": "3857960f-41e5-4204-80c1-c5945a4037cf", "address": "fa:16:3e:22:c4:27", "network": {"id": "6947bccf-ee92-46d1-bbf7-15a6920e1b0c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-124553213-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ab7190851127465491dd4808bbcc3e87", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae18b41f-e73c-44f1-83dd-467c080944f4", "external-id": "nsx-vlan-transportzone-653", "segmentation_id": 653, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3857960f-41", "ovs_interfaceid": "3857960f-41e5-4204-80c1-c5945a4037cf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1414.397844] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Releasing lock "refresh_cache-ec783231-6f62-4177-ba76-4ba688dda077" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1414.398149] env[67964]: DEBUG nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Instance network_info: |[{"id": "3857960f-41e5-4204-80c1-c5945a4037cf", "address": "fa:16:3e:22:c4:27", "network": {"id": "6947bccf-ee92-46d1-bbf7-15a6920e1b0c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-124553213-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ab7190851127465491dd4808bbcc3e87", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae18b41f-e73c-44f1-83dd-467c080944f4", 
"external-id": "nsx-vlan-transportzone-653", "segmentation_id": 653, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3857960f-41", "ovs_interfaceid": "3857960f-41e5-4204-80c1-c5945a4037cf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1414.398491] env[67964]: DEBUG oslo_concurrency.lockutils [req-66e76351-1652-4a18-97e1-83c45030fe45 req-7686b15a-247c-4cb1-8923-f36220f0b076 service nova] Acquired lock "refresh_cache-ec783231-6f62-4177-ba76-4ba688dda077" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1414.398677] env[67964]: DEBUG nova.network.neutron [req-66e76351-1652-4a18-97e1-83c45030fe45 req-7686b15a-247c-4cb1-8923-f36220f0b076 service nova] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Refreshing network info cache for port 3857960f-41e5-4204-80c1-c5945a4037cf {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1414.400052] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:22:c4:27', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ae18b41f-e73c-44f1-83dd-467c080944f4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3857960f-41e5-4204-80c1-c5945a4037cf', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1414.407256] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Creating folder: Project (ab7190851127465491dd4808bbcc3e87). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1414.408033] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-843a9a56-ed3b-47be-8019-168fbc836652 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.420767] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Created folder: Project (ab7190851127465491dd4808bbcc3e87) in parent group-v690366. [ 1414.421087] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Creating folder: Instances. Parent ref: group-v690454. 
{{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1414.421087] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f8d0fbba-7e29-410e-a38d-4c259a2dcb71 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.430160] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Created folder: Instances in parent group-v690454. [ 1414.430160] env[67964]: DEBUG oslo.service.loopingcall [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1414.430160] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1414.430302] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ff512534-b445-433f-a9f8-c5d178e4ec24 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.447794] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1414.447794] env[67964]: value = "task-3456830" [ 1414.447794] env[67964]: _type = "Task" [ 1414.447794] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1414.454989] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456830, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1414.904773] env[67964]: DEBUG nova.network.neutron [req-66e76351-1652-4a18-97e1-83c45030fe45 req-7686b15a-247c-4cb1-8923-f36220f0b076 service nova] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Updated VIF entry in instance network info cache for port 3857960f-41e5-4204-80c1-c5945a4037cf. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1414.905155] env[67964]: DEBUG nova.network.neutron [req-66e76351-1652-4a18-97e1-83c45030fe45 req-7686b15a-247c-4cb1-8923-f36220f0b076 service nova] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Updating instance_info_cache with network_info: [{"id": "3857960f-41e5-4204-80c1-c5945a4037cf", "address": "fa:16:3e:22:c4:27", "network": {"id": "6947bccf-ee92-46d1-bbf7-15a6920e1b0c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-124553213-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ab7190851127465491dd4808bbcc3e87", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae18b41f-e73c-44f1-83dd-467c080944f4", "external-id": "nsx-vlan-transportzone-653", "segmentation_id": 653, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3857960f-41", "ovs_interfaceid": "3857960f-41e5-4204-80c1-c5945a4037cf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1414.915369] env[67964]: DEBUG oslo_concurrency.lockutils [req-66e76351-1652-4a18-97e1-83c45030fe45 req-7686b15a-247c-4cb1-8923-f36220f0b076 service nova] Releasing lock "refresh_cache-ec783231-6f62-4177-ba76-4ba688dda077" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1414.957493] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456830, 'name': CreateVM_Task, 'duration_secs': 0.28513} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1414.957647] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1414.958341] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1414.958558] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1414.958872] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1414.959134] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1ac478f7-c3ed-450b-8cea-aa07a0428f21 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.963637] env[67964]: DEBUG oslo_vmware.api [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Waiting for the task: (returnval){ [ 1414.963637] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52ba38b1-df18-df90-69e3-75341eabc108" [ 1414.963637] env[67964]: _type = "Task" [ 1414.963637] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1414.971188] env[67964]: DEBUG oslo_vmware.api [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52ba38b1-df18-df90-69e3-75341eabc108, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1415.473802] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1415.474136] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1415.474268] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1417.790837] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "41d93bf8-7991-4b52-8ebb-a1988dc627c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1417.791124] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "41d93bf8-7991-4b52-8ebb-a1988dc627c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1429.450954] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e7f5111-7df7-4d0d-bc77-d8548545b7b2 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Acquiring lock "ec783231-6f62-4177-ba76-4ba688dda077" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1434.801317] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1434.801692] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Cleaning up deleted instances {{(pid=67964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11199}} [ 1434.811411] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] There are 0 instances to clean {{(pid=67964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11208}}
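
The periodic entries here (_run_pending_deletes above, update_available_resource below) come from a runner that wakes on an interval, logs each registered task by name, and invokes it. A toy stdlib version of that loop; run_periodic_tasks and the task table are illustrative stand-ins, not the oslo.service implementation.

    import time

    def run_periodic_tasks(tasks, interval=1.0, rounds=1):
        # Wake on a fixed interval, log each registered task by name, run it.
        for _ in range(rounds):
            for name, task in tasks.items():
                print(f"Running periodic task {name}")
                task()
            time.sleep(interval)

    def run_pending_deletes():
        # The real task queries soft-deleted instances and cleans them up;
        # here there is nothing to do, matching the log above.
        print("Cleaning up deleted instances")
        print("There are 0 instances to clean")

    run_periodic_tasks({"ComputeManager._run_pending_deletes": run_pending_deletes},
                       interval=0.0)
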
[ 1442.810443] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1442.822400] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1442.822615] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1442.822778] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1442.822937] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1442.824055] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f1c2040-ba9a-428b-b709-19caf075a788 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1442.833022] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fa64a65-1e4a-42a4-9954-415d863c0486 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1442.847271] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a8c8a43-b20e-4403-926f-dc7d882de25f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1442.853726] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef6cb584-1214-40f1-95ad-35ba0eae70f6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1442.884381] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180927MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1442.884435] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1442.884623] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock
"compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1442.962134] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 5fbee4c3-bc7c-4582-b976-b0d619a69cdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1442.962378] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 67eb58c3-a895-4427-9197-3b0c731a123a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1442.962459] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1442.962549] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d9dcb5d4-e8a3-4d4d-af94-1bde87121c08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1442.962664] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9cd7ef82-147a-4303-a773-32b161f819ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1442.962781] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18c148fb-1cd4-4537-9b77-089e9b272f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1442.962897] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18d6df82-a19a-499a-8874-171218569651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1442.963062] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ee34b117-806d-4cc4-98b7-0f40f074cfab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1442.963218] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7825ba9e-8603-4211-b5fe-708276272464 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1442.963336] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec783231-6f62-4177-ba76-4ba688dda077 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1442.975682] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance a0908e14-521d-42c1-baaa-b5863b1f142d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1442.988286] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d64969c7-d467-4958-8b04-aa2d2920769a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1442.998812] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea5f3d40-6494-459a-a917-2602d0718d8c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1443.009795] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance da8f11e2-6d58-4e28-aabb-9943bc657e60 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1443.022351] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2c06844d-2c7f-4e27-b3c6-16dfd6047119 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1443.032582] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 41d93bf8-7991-4b52-8ebb-a1988dc627c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1443.032814] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1443.032965] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1443.219056] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-380275da-5f3f-45ce-9983-6df65336883c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1443.226562] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae311bcf-f9fc-4882-8c8b-19a7fde4344e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1443.255837] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f9f8bb6-0004-46b5-841b-44b23d816a07 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1443.262535] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f9e3527-495b-4819-b3e7-aef92d46cbd8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1443.275052] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1443.284051] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1443.301751] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1443.301930] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.417s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1444.292457] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1444.801285] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1444.801285] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1445.801425] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1446.800220] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1447.800988] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1447.801329] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1448.803591] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1450.800593] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1451.530379] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bbfcd055-21e0-44fb-bbed-dbe19ce90b79 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "76ddefb8-a93f-483a-9487-bc05f5dfef3f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1451.530606] 
env[67964]: DEBUG oslo_concurrency.lockutils [None req-bbfcd055-21e0-44fb-bbed-dbe19ce90b79 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "76ddefb8-a93f-483a-9487-bc05f5dfef3f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1451.800885] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1451.801294] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1451.801294] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1451.823008] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1451.823382] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1451.823449] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1451.823595] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1451.823692] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1451.823812] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1451.823929] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18d6df82-a19a-499a-8874-171218569651] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1451.824059] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1451.824178] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1451.824328] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1451.824402] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1453.800259] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1453.800643] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Cleaning up deleted instances with incomplete migration {{(pid=67964) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11237}} [ 1457.641085] env[67964]: WARNING oslo_vmware.rw_handles [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1457.641085] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1457.641085] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1457.641085] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1457.641085] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1457.641085] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1457.641085] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1457.641085] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1457.641085] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1457.641085] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1457.641085] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1457.641085] env[67964]: ERROR oslo_vmware.rw_handles [ 1457.641661] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] 
[instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/1afe579a-f0ff-4296-bcec-9ab3d43269e5/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1457.643304] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1457.643544] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Copying Virtual Disk [datastore1] vmware_temp/1afe579a-f0ff-4296-bcec-9ab3d43269e5/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/1afe579a-f0ff-4296-bcec-9ab3d43269e5/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1457.643828] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-83cecd6e-5e8f-47ae-9f3d-6f94b360470e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1457.652590] env[67964]: DEBUG oslo_vmware.api [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Waiting for the task: (returnval){ [ 1457.652590] env[67964]: value = "task-3456831" [ 1457.652590] env[67964]: _type = "Task" [ 1457.652590] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1457.660181] env[67964]: DEBUG oslo_vmware.api [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Task: {'id': task-3456831, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1458.162657] env[67964]: DEBUG oslo_vmware.exceptions [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Fault InvalidArgument not matched. 
{{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1458.162934] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1458.163500] env[67964]: ERROR nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1458.163500] env[67964]: Faults: ['InvalidArgument'] [ 1458.163500] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Traceback (most recent call last): [ 1458.163500] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1458.163500] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] yield resources [ 1458.163500] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1458.163500] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] self.driver.spawn(context, instance, image_meta, [ 1458.163500] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1458.163500] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1458.163500] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1458.163500] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] self._fetch_image_if_missing(context, vi) [ 1458.163500] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1458.163870] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] image_cache(vi, tmp_image_ds_loc) [ 1458.163870] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1458.163870] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] vm_util.copy_virtual_disk( [ 1458.163870] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1458.163870] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] session._wait_for_task(vmdk_copy_task) [ 1458.163870] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 1458.163870] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] return self.wait_for_task(task_ref) [ 1458.163870] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1458.163870] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] return evt.wait() [ 1458.163870] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1458.163870] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] result = hub.switch() [ 1458.163870] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1458.163870] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] return self.greenlet.switch() [ 1458.164268] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1458.164268] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] self.f(*self.args, **self.kw) [ 1458.164268] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1458.164268] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] raise exceptions.translate_fault(task_info.error) [ 1458.164268] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1458.164268] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Faults: ['InvalidArgument'] [ 1458.164268] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] [ 1458.164268] env[67964]: INFO nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Terminating instance [ 1458.165400] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1458.165606] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1458.165833] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3b523098-b468-4fa2-b31e-e7b91734875b {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.168610] env[67964]: DEBUG nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1458.168939] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1458.169887] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d1cf4eb-b0dd-4d17-b6e2-921b225789bc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.176579] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1458.176785] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c8f6efc5-ae1b-43b6-93d9-c49925949821 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.178860] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1458.179081] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1458.180014] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2b1c114d-10af-453f-9d34-7e20a41a1e2e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.184531] env[67964]: DEBUG oslo_vmware.api [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Waiting for the task: (returnval){ [ 1458.184531] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5206af10-cde6-3a41-7695-974db8cdd7cf" [ 1458.184531] env[67964]: _type = "Task" [ 1458.184531] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1458.191467] env[67964]: DEBUG oslo_vmware.api [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5206af10-cde6-3a41-7695-974db8cdd7cf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1458.241306] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1458.241560] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1458.241735] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Deleting the datastore file [datastore1] 5fbee4c3-bc7c-4582-b976-b0d619a69cdb {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1458.241998] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-84cb8a36-56cf-4607-954e-8a8a0d0dc611 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.248085] env[67964]: DEBUG oslo_vmware.api [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Waiting for the task: (returnval){ [ 1458.248085] env[67964]: value = "task-3456833" [ 1458.248085] env[67964]: _type = "Task" [ 1458.248085] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1458.255550] env[67964]: DEBUG oslo_vmware.api [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Task: {'id': task-3456833, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1458.694513] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1458.694780] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Creating directory with path [datastore1] vmware_temp/0e8799dd-2056-43e1-90ad-32975ea789c6/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1458.695030] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1370117a-3470-4c8c-bba0-7981f216ef31 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.706441] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Created directory with path [datastore1] vmware_temp/0e8799dd-2056-43e1-90ad-32975ea789c6/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1458.706638] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Fetch image to [datastore1] vmware_temp/0e8799dd-2056-43e1-90ad-32975ea789c6/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1458.706809] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/0e8799dd-2056-43e1-90ad-32975ea789c6/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1458.707617] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83c48377-0bb5-47db-beee-e7e64c8594ab {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.714458] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63197132-17d9-4253-8cc2-141232704085 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.724547] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5ca0da1-0b0e-41f9-b40e-f7adaa21ee25 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.757437] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-9d99afb2-c2c7-4798-bb62-cd67cfd9a98f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.764213] env[67964]: DEBUG oslo_vmware.api [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Task: {'id': task-3456833, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075383} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1458.765613] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1458.765802] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1458.765969] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1458.766154] env[67964]: INFO nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1458.767906] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b44c0bf4-2f29-49a3-af1e-f0f92644521c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.769760] env[67964]: DEBUG nova.compute.claims [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1458.769926] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1458.770163] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1458.795138] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1458.848648] env[67964]: DEBUG oslo_vmware.rw_handles [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0e8799dd-2056-43e1-90ad-32975ea789c6/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1458.908693] env[67964]: DEBUG oslo_vmware.rw_handles [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1458.908879] env[67964]: DEBUG oslo_vmware.rw_handles [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0e8799dd-2056-43e1-90ad-32975ea789c6/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1459.049741] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a70e86bd-cda3-4c19-957a-e9f14610715f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.057655] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33d5517f-f2ad-4bbc-862f-8260f64a761a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.087191] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f32ebbc-a5b8-4ad4-bda2-e62026c66d21 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.093849] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a351641c-e305-4648-9268-d0b246a10beb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.106735] env[67964]: DEBUG nova.compute.provider_tree [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1459.114918] env[67964]: DEBUG nova.scheduler.client.report [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1459.129585] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.359s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.129980] env[67964]: ERROR nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1459.129980] env[67964]: Faults: ['InvalidArgument'] [ 1459.129980] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Traceback (most recent call last): [ 1459.129980] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1459.129980] env[67964]: ERROR nova.compute.manager [instance: 
5fbee4c3-bc7c-4582-b976-b0d619a69cdb] self.driver.spawn(context, instance, image_meta, [ 1459.129980] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1459.129980] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1459.129980] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1459.129980] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] self._fetch_image_if_missing(context, vi) [ 1459.129980] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1459.129980] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] image_cache(vi, tmp_image_ds_loc) [ 1459.129980] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1459.130341] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] vm_util.copy_virtual_disk( [ 1459.130341] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1459.130341] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] session._wait_for_task(vmdk_copy_task) [ 1459.130341] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1459.130341] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] return self.wait_for_task(task_ref) [ 1459.130341] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1459.130341] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] return evt.wait() [ 1459.130341] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1459.130341] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] result = hub.switch() [ 1459.130341] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1459.130341] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] return self.greenlet.switch() [ 1459.130341] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1459.130341] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] self.f(*self.args, **self.kw) [ 1459.130695] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1459.130695] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] raise exceptions.translate_fault(task_info.error) [ 1459.130695] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1459.130695] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Faults: ['InvalidArgument'] [ 1459.130695] env[67964]: ERROR nova.compute.manager [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] [ 1459.130695] env[67964]: DEBUG nova.compute.utils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1459.132071] env[67964]: DEBUG nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Build of instance 5fbee4c3-bc7c-4582-b976-b0d619a69cdb was re-scheduled: A specified parameter was not correct: fileType [ 1459.132071] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1459.132440] env[67964]: DEBUG nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1459.132611] env[67964]: DEBUG nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1459.132775] env[67964]: DEBUG nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1459.132933] env[67964]: DEBUG nova.network.neutron [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1459.531161] env[67964]: DEBUG nova.network.neutron [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1459.541523] env[67964]: INFO nova.compute.manager [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Took 0.41 seconds to deallocate network for instance. [ 1459.629800] env[67964]: INFO nova.scheduler.client.report [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Deleted allocations for instance 5fbee4c3-bc7c-4582-b976-b0d619a69cdb [ 1459.650793] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9bd0fe25-ffa5-4840-94f3-984a420fe43c tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 626.127s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.652408] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f14fe89e-ab67-4664-87b4-e959748abdab tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 428.175s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1459.652633] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f14fe89e-ab67-4664-87b4-e959748abdab tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Acquiring lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1459.652834] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f14fe89e-ab67-4664-87b4-e959748abdab tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
1459.653008] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f14fe89e-ab67-4664-87b4-e959748abdab tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.654977] env[67964]: INFO nova.compute.manager [None req-f14fe89e-ab67-4664-87b4-e959748abdab tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Terminating instance [ 1459.656555] env[67964]: DEBUG nova.compute.manager [None req-f14fe89e-ab67-4664-87b4-e959748abdab tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1459.656741] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f14fe89e-ab67-4664-87b4-e959748abdab tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1459.657227] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2a30c262-ec40-4743-9086-7826eb23b07f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.661855] env[67964]: DEBUG nova.compute.manager [None req-fbf6107f-c6e6-4dca-a4dd-58e7adfb9f53 tempest-ServerShowV247Test-811162969 tempest-ServerShowV247Test-811162969-project-member] [instance: 0f126555-f26e-42da-a468-28a28887c901] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1459.668126] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73ae5de7-1d11-463e-a95e-902f4ed369d8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.685181] env[67964]: DEBUG nova.compute.manager [None req-fbf6107f-c6e6-4dca-a4dd-58e7adfb9f53 tempest-ServerShowV247Test-811162969 tempest-ServerShowV247Test-811162969-project-member] [instance: 0f126555-f26e-42da-a468-28a28887c901] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1459.697248] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-f14fe89e-ab67-4664-87b4-e959748abdab tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5fbee4c3-bc7c-4582-b976-b0d619a69cdb could not be found. 
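The "acquired ... waited" and "released ... held" lock lines above are emitted by oslo.concurrency itself around each critical section; Nova serializes all work on an instance behind a lock named after its UUID, which is why the terminate above waited 428.175s for the long build to release it. A minimal sketch of the pattern, assuming the plain lockutils decorator form rather than Nova's actual wrapper code (the UUID is taken from this log):

from oslo_concurrency import lockutils

INSTANCE_UUID = "5fbee4c3-bc7c-4582-b976-b0d619a69cdb"  # instance UUID from the log above

@lockutils.synchronized(INSTANCE_UUID)
def do_terminate_instance():
    # lockutils logs the DEBUG "acquired ... waited Ns" and
    # "released ... held Ns" lines itself; this body runs only while
    # the per-instance lock is held, so terminate cannot overlap a
    # build that is still holding the same lock name.
    pass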
[ 1459.697496] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-f14fe89e-ab67-4664-87b4-e959748abdab tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1459.697661] env[67964]: INFO nova.compute.manager [None req-f14fe89e-ab67-4664-87b4-e959748abdab tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1459.697993] env[67964]: DEBUG oslo.service.loopingcall [None req-f14fe89e-ab67-4664-87b4-e959748abdab tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1459.698694] env[67964]: DEBUG nova.compute.manager [-] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1459.698801] env[67964]: DEBUG nova.network.neutron [-] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1459.711117] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fbf6107f-c6e6-4dca-a4dd-58e7adfb9f53 tempest-ServerShowV247Test-811162969 tempest-ServerShowV247Test-811162969-project-member] Lock "0f126555-f26e-42da-a468-28a28887c901" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 216.266s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.720938] env[67964]: DEBUG nova.compute.manager [None req-9a8db661-4469-40ee-abe6-9f08886943f1 tempest-ServerShowV247Test-811162969 tempest-ServerShowV247Test-811162969-project-member] [instance: 0228456f-0055-43b9-9a81-e0f031e2a549] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1459.724842] env[67964]: DEBUG nova.network.neutron [-] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1459.732694] env[67964]: INFO nova.compute.manager [-] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] Took 0.03 seconds to deallocate network for instance. [ 1459.749951] env[67964]: DEBUG nova.compute.manager [None req-9a8db661-4469-40ee-abe6-9f08886943f1 tempest-ServerShowV247Test-811162969 tempest-ServerShowV247Test-811162969-project-member] [instance: 0228456f-0055-43b9-9a81-e0f031e2a549] Instance disappeared before build. 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1459.774109] env[67964]: DEBUG oslo_concurrency.lockutils [None req-9a8db661-4469-40ee-abe6-9f08886943f1 tempest-ServerShowV247Test-811162969 tempest-ServerShowV247Test-811162969-project-member] Lock "0228456f-0055-43b9-9a81-e0f031e2a549" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 215.872s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.785209] env[67964]: DEBUG nova.compute.manager [None req-b9df27ba-4726-4edc-809c-64e894beadff tempest-ServerMetadataNegativeTestJSON-1124299271 tempest-ServerMetadataNegativeTestJSON-1124299271-project-member] [instance: a0908e14-521d-42c1-baaa-b5863b1f142d] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1459.809169] env[67964]: DEBUG nova.compute.manager [None req-b9df27ba-4726-4edc-809c-64e894beadff tempest-ServerMetadataNegativeTestJSON-1124299271 tempest-ServerMetadataNegativeTestJSON-1124299271-project-member] [instance: a0908e14-521d-42c1-baaa-b5863b1f142d] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1459.832930] env[67964]: DEBUG oslo_concurrency.lockutils [None req-f14fe89e-ab67-4664-87b4-e959748abdab tempest-ServersTestJSON-1119709012 tempest-ServersTestJSON-1119709012-project-member] Lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.180s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.834296] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 300.115s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1459.834296] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 5fbee4c3-bc7c-4582-b976-b0d619a69cdb] During sync_power_state the instance has a pending task (deleting). Skip. 
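The sync_power_state record above reflects a guard in the periodic power-state audit: an instance that still has a task_state (here 'deleting') is skipped rather than reconciled, so the in-flight operation settles the final state. A hedged sketch of that guard, not Nova's literal code; the function name mirrors the lock target in the log, and the instance object's attributes are assumed:

import logging

LOG = logging.getLogger(__name__)

def query_driver_power_state_and_sync(instance):
    # Skip instances with an operation in flight; the pending task
    # (e.g. 'deleting') will determine the final power state.
    if instance.task_state is not None:
        LOG.info("During sync_power_state the instance has a pending "
                 "task (%s). Skip.", instance.task_state)
        return
    # ...otherwise compare the hypervisor's view of the power state
    # with the DB record and correct whichever side is stale...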
[ 1459.834296] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "5fbee4c3-bc7c-4582-b976-b0d619a69cdb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.838323] env[67964]: DEBUG oslo_concurrency.lockutils [None req-b9df27ba-4726-4edc-809c-64e894beadff tempest-ServerMetadataNegativeTestJSON-1124299271 tempest-ServerMetadataNegativeTestJSON-1124299271-project-member] Lock "a0908e14-521d-42c1-baaa-b5863b1f142d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 211.725s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.848811] env[67964]: DEBUG nova.compute.manager [None req-2fdf58fe-e78e-4265-a4b0-303fc616d9d4 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: d64969c7-d467-4958-8b04-aa2d2920769a] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1459.871714] env[67964]: DEBUG nova.compute.manager [None req-2fdf58fe-e78e-4265-a4b0-303fc616d9d4 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: d64969c7-d467-4958-8b04-aa2d2920769a] Instance disappeared before build. {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1459.892191] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2fdf58fe-e78e-4265-a4b0-303fc616d9d4 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "d64969c7-d467-4958-8b04-aa2d2920769a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 205.756s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.901289] env[67964]: DEBUG nova.compute.manager [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Starting instance... 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1459.951938] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1459.952243] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1459.954088] env[67964]: INFO nova.compute.claims [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1460.173849] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07df3439-b8da-4ecb-a153-ef1cc5915d61 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.181901] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84e68e35-f218-4d19-a98f-62b925422bec {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.215535] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a90234ab-b424-47d0-8ab9-4563d3a89f34 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.223293] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2091e2e7-d553-40a2-85d1-3565154bad46 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.236548] env[67964]: DEBUG nova.compute.provider_tree [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1460.246715] env[67964]: DEBUG nova.scheduler.client.report [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1460.260058] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 
tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1460.260516] env[67964]: DEBUG nova.compute.manager [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1460.290815] env[67964]: DEBUG nova.compute.utils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1460.292469] env[67964]: DEBUG nova.compute.manager [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Not allocating networking since 'none' was specified. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1953}} [ 1460.301170] env[67964]: DEBUG nova.compute.manager [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1460.361739] env[67964]: DEBUG nova.compute.manager [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1460.386248] env[67964]: DEBUG nova.virt.hardware [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1460.386513] env[67964]: DEBUG nova.virt.hardware [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1460.386669] env[67964]: DEBUG nova.virt.hardware [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1460.386858] env[67964]: DEBUG nova.virt.hardware [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1460.387038] env[67964]: DEBUG nova.virt.hardware [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1460.387196] env[67964]: DEBUG nova.virt.hardware [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1460.387399] env[67964]: DEBUG nova.virt.hardware [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1460.387553] env[67964]: DEBUG nova.virt.hardware [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1460.387714] env[67964]: DEBUG nova.virt.hardware [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 
tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1460.387874] env[67964]: DEBUG nova.virt.hardware [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1460.388057] env[67964]: DEBUG nova.virt.hardware [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1460.388902] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-217da4a5-0def-49ca-8b73-f264b908eb31 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.396701] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b655cdbf-4561-4922-90fc-242937765348 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.409950] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Instance VIF info [] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1460.415395] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Creating folder: Project (5b6971af443a498897d511e42ad0b9d8). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1460.415643] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b66f14d7-febf-4ae6-896c-23d53603464c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.425699] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Created folder: Project (5b6971af443a498897d511e42ad0b9d8) in parent group-v690366. [ 1460.425871] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Creating folder: Instances. Parent ref: group-v690457. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1460.426097] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4201c528-2925-4a52-8905-637bd1e831c1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.433950] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Created folder: Instances in parent group-v690457. 
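The folder-creation and CreateVM_Task records here follow oslo.vmware's invoke-and-poll pattern: a vSphere method invoked through the session returns a Task managed object, and wait_for_task polls it (the "progress is 0%." lines below) until it completes or raises a fault. A minimal sketch of that pattern, assuming the session and managed-object references are supplied by the caller; the names are illustrative, not this driver's code:

def create_vm(session, vm_folder, config_spec, res_pool):
    # session: an oslo_vmware.api.VMwareAPISession; vm_folder/res_pool:
    # managed-object references; config_spec: a VirtualMachineConfigSpec.
    task = session.invoke_api(session.vim, "CreateVM_Task", vm_folder,
                              config=config_spec, pool=res_pool)
    # Polls the task at the session's task_poll_interval, logging its
    # progress, and raises VimFaultException if the task reports an
    # error (as the fileType/InvalidArgument failures in this log do).
    return session.wait_for_task(task)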
[ 1460.434186] env[67964]: DEBUG oslo.service.loopingcall [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1460.434363] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1460.434543] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1da83919-f0d0-4866-a7c0-8ab2976eb193 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.449412] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1460.449412] env[67964]: value = "task-3456836" [ 1460.449412] env[67964]: _type = "Task" [ 1460.449412] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1460.456077] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456836, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1460.959711] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456836, 'name': CreateVM_Task, 'duration_secs': 0.23991} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1460.960042] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1460.960367] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1460.960529] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1460.960859] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1460.961130] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-baeafd09-ad9c-47b6-b0ef-2d9387a745fa {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.965386] env[67964]: DEBUG oslo_vmware.api [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Waiting for the task: 
(returnval){ [ 1460.965386] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]521d03ca-e7d6-ab9e-09aa-17296fdce3a5" [ 1460.965386] env[67964]: _type = "Task" [ 1460.965386] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1460.972928] env[67964]: DEBUG oslo_vmware.api [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]521d03ca-e7d6-ab9e-09aa-17296fdce3a5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1461.475257] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1461.475520] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1461.475734] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1502.810284] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1502.822099] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1502.822304] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1502.822471] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1502.822627] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) 
{{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1502.823747] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d660feb9-30b5-4402-b14a-33a2a5ae839b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.832534] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85513304-fedb-4f26-b2c7-7237c929e00b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.845859] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aca2a04b-f901-478c-92a3-b4d59038169b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.851992] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84e5fc3c-b647-4e80-ad2d-bfca0736018d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.882293] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180882MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1502.882486] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1502.882640] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1503.025049] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 67eb58c3-a895-4427-9197-3b0c731a123a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1503.025049] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1503.025049] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d9dcb5d4-e8a3-4d4d-af94-1bde87121c08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1503.025049] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9cd7ef82-147a-4303-a773-32b161f819ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1503.025272] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18c148fb-1cd4-4537-9b77-089e9b272f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1503.025272] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18d6df82-a19a-499a-8874-171218569651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1503.025272] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ee34b117-806d-4cc4-98b7-0f40f074cfab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1503.025395] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7825ba9e-8603-4211-b5fe-708276272464 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1503.025511] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec783231-6f62-4177-ba76-4ba688dda077 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1503.025624] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea5f3d40-6494-459a-a917-2602d0718d8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1503.037021] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance da8f11e2-6d58-4e28-aabb-9943bc657e60 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1503.047614] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2c06844d-2c7f-4e27-b3c6-16dfd6047119 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1503.058453] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 41d93bf8-7991-4b52-8ebb-a1988dc627c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1503.067725] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 76ddefb8-a93f-483a-9487-bc05f5dfef3f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1503.067951] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1503.068107] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1503.083753] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing inventories for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:818}} [ 1503.098967] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Updating ProviderTree inventory for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:782}} [ 1503.099167] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Updating inventory in ProviderTree for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1503.109735] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing aggregate associations for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41, aggregates: None {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:827}} [ 1503.127585] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing trait associations for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:839}} [ 1503.282043] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b54a2db-74cf-4142-9897-7d665d68842a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.289816] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b601813-d896-4b98-9ab9-ea9dd526475f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.318450] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f89a3315-2515-43c8-be8c-1ccdfdad1e85 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.325566] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b16593f5-8255-4c7d-9fef-3795eaf979be {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.338276] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1503.346334] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1503.359061] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1503.359230] env[67964]: DEBUG oslo_concurrency.lockutils [None 
req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.477s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1505.350050] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1506.686098] env[67964]: WARNING oslo_vmware.rw_handles [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1506.686098] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1506.686098] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1506.686098] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1506.686098] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1506.686098] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1506.686098] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1506.686098] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1506.686098] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1506.686098] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1506.686098] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1506.686098] env[67964]: ERROR oslo_vmware.rw_handles [ 1506.686819] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/0e8799dd-2056-43e1-90ad-32975ea789c6/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1506.688532] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1506.688814] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Copying Virtual Disk [datastore1] vmware_temp/0e8799dd-2056-43e1-90ad-32975ea789c6/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/0e8799dd-2056-43e1-90ad-32975ea789c6/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk 
/opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1506.689157] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-300ac007-eae6-4bb7-8ebb-9347b145db9b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1506.696351] env[67964]: DEBUG oslo_vmware.api [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Waiting for the task: (returnval){ [ 1506.696351] env[67964]: value = "task-3456837" [ 1506.696351] env[67964]: _type = "Task" [ 1506.696351] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1506.704211] env[67964]: DEBUG oslo_vmware.api [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Task: {'id': task-3456837, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1506.799943] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1506.800151] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1507.206571] env[67964]: DEBUG oslo_vmware.exceptions [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Fault InvalidArgument not matched. 
{{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1507.206863] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1507.207488] env[67964]: ERROR nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1507.207488] env[67964]: Faults: ['InvalidArgument'] [ 1507.207488] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Traceback (most recent call last): [ 1507.207488] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1507.207488] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] yield resources [ 1507.207488] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1507.207488] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] self.driver.spawn(context, instance, image_meta, [ 1507.207488] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1507.207488] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1507.207488] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1507.207488] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] self._fetch_image_if_missing(context, vi) [ 1507.207488] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1507.207985] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] image_cache(vi, tmp_image_ds_loc) [ 1507.207985] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1507.207985] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] vm_util.copy_virtual_disk( [ 1507.207985] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1507.207985] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] session._wait_for_task(vmdk_copy_task) [ 1507.207985] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1507.207985] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] return self.wait_for_task(task_ref) [ 1507.207985] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1507.207985] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] return evt.wait() [ 1507.207985] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1507.207985] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] result = hub.switch() [ 1507.207985] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1507.207985] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] return self.greenlet.switch() [ 1507.208578] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1507.208578] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] self.f(*self.args, **self.kw) [ 1507.208578] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1507.208578] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] raise exceptions.translate_fault(task_info.error) [ 1507.208578] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1507.208578] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Faults: ['InvalidArgument'] [ 1507.208578] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] [ 1507.208578] env[67964]: INFO nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Terminating instance [ 1507.209948] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1507.209948] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1507.209948] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-15b64385-b24d-4c4b-a7ac-f2247ce31db0 {{(pid=67964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.212149] env[67964]: DEBUG nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1507.212336] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1507.213049] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-452d2bea-ad80-4e26-b3d6-91989bfc85fa {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.220846] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1507.221070] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2f3abb58-5cde-4f6a-82e1-ad59ddb57a1e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.223189] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1507.223356] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1507.224281] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-12d6ee6a-29b0-4683-82b1-00a4e0ff4ee7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.229139] env[67964]: DEBUG oslo_vmware.api [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Waiting for the task: (returnval){ [ 1507.229139] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5247fd1b-0087-94df-770e-25ae07cc47c1" [ 1507.229139] env[67964]: _type = "Task" [ 1507.229139] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1507.237167] env[67964]: DEBUG oslo_vmware.api [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5247fd1b-0087-94df-770e-25ae07cc47c1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1507.290872] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1507.291104] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1507.291287] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Deleting the datastore file [datastore1] 67eb58c3-a895-4427-9197-3b0c731a123a {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1507.291548] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b2c61c75-ebb2-40ce-9234-874194c788aa {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.297951] env[67964]: DEBUG oslo_vmware.api [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Waiting for the task: (returnval){ [ 1507.297951] env[67964]: value = "task-3456839" [ 1507.297951] env[67964]: _type = "Task" [ 1507.297951] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1507.305512] env[67964]: DEBUG oslo_vmware.api [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Task: {'id': task-3456839, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1507.738636] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1507.738935] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Creating directory with path [datastore1] vmware_temp/0c965ab6-594e-46ea-ad45-722847752a2b/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1507.739083] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-94028e5a-0c7c-4d0f-a8f2-069527c5c7c2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.750940] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Created directory with path [datastore1] vmware_temp/0c965ab6-594e-46ea-ad45-722847752a2b/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1507.751133] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Fetch image to [datastore1] vmware_temp/0c965ab6-594e-46ea-ad45-722847752a2b/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1507.751300] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/0c965ab6-594e-46ea-ad45-722847752a2b/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1507.751985] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67ba6688-a580-4353-be87-4cbea7ed1a8c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.758234] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4761f6f9-ad21-439b-8250-14611432be86 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.766898] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ca9b5c9-a11a-48e9-b103-d420b5f93c50 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.796729] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d271710-9590-4a7d-b091-7bbd2b63980d {{(pid=67964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.800280] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1507.804328] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1507.809387] env[67964]: DEBUG oslo_vmware.api [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Task: {'id': task-3456839, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064295} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1507.810781] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1507.810971] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1507.811156] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1507.811325] env[67964]: INFO nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1507.813264] env[67964]: DEBUG nova.compute.claims [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1507.813430] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1507.813636] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1507.816760] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bf7b2bb6-e167-4819-a036-bb51d5931bac {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.839996] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1507.890873] env[67964]: DEBUG oslo_vmware.rw_handles [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0c965ab6-594e-46ea-ad45-722847752a2b/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1507.951025] env[67964]: DEBUG oslo_vmware.rw_handles [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1507.951228] env[67964]: DEBUG oslo_vmware.rw_handles [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0c965ab6-594e-46ea-ad45-722847752a2b/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1508.085772] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2290299a-7974-4d3f-980a-9743acef5c6b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.093614] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da58a98a-ac27-48ad-936e-f9cfa3a7e972 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.122494] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e51ae056-72eb-444e-b6ac-1c59d9103933 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.129573] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a04393e1-88bc-473b-8fa0-1bf0b2956039 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.142665] env[67964]: DEBUG nova.compute.provider_tree [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1508.151037] env[67964]: DEBUG nova.scheduler.client.report [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1508.164034] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.350s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1508.164592] env[67964]: ERROR nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1508.164592] env[67964]: Faults: ['InvalidArgument'] [ 1508.164592] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Traceback (most recent call last): [ 1508.164592] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1508.164592] env[67964]: 
ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] self.driver.spawn(context, instance, image_meta, [ 1508.164592] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1508.164592] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1508.164592] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1508.164592] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] self._fetch_image_if_missing(context, vi) [ 1508.164592] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1508.164592] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] image_cache(vi, tmp_image_ds_loc) [ 1508.164592] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1508.164914] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] vm_util.copy_virtual_disk( [ 1508.164914] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1508.164914] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] session._wait_for_task(vmdk_copy_task) [ 1508.164914] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1508.164914] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] return self.wait_for_task(task_ref) [ 1508.164914] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1508.164914] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] return evt.wait() [ 1508.164914] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1508.164914] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] result = hub.switch() [ 1508.164914] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1508.164914] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] return self.greenlet.switch() [ 1508.164914] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1508.164914] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] self.f(*self.args, **self.kw) [ 1508.165254] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1508.165254] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] raise exceptions.translate_fault(task_info.error) [ 1508.165254] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1508.165254] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Faults: ['InvalidArgument'] [ 1508.165254] env[67964]: ERROR nova.compute.manager [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] [ 1508.165396] env[67964]: DEBUG nova.compute.utils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1508.166671] env[67964]: DEBUG nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Build of instance 67eb58c3-a895-4427-9197-3b0c731a123a was re-scheduled: A specified parameter was not correct: fileType [ 1508.166671] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1508.167051] env[67964]: DEBUG nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1508.167223] env[67964]: DEBUG nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1508.167388] env[67964]: DEBUG nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1508.167578] env[67964]: DEBUG nova.network.neutron [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1508.608976] env[67964]: DEBUG nova.network.neutron [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1508.621870] env[67964]: INFO nova.compute.manager [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Took 0.45 seconds to deallocate network for instance. [ 1508.724080] env[67964]: INFO nova.scheduler.client.report [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Deleted allocations for instance 67eb58c3-a895-4427-9197-3b0c731a123a [ 1508.754735] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8cea451d-5544-4b5c-bebd-1c028503c019 tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Lock "67eb58c3-a895-4427-9197-3b0c731a123a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 613.707s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1508.755891] env[67964]: DEBUG oslo_concurrency.lockutils [None req-5d58ebb5-b5e4-4820-ab64-d45d21805e6c tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Lock "67eb58c3-a895-4427-9197-3b0c731a123a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 417.932s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1508.756137] env[67964]: DEBUG oslo_concurrency.lockutils [None req-5d58ebb5-b5e4-4820-ab64-d45d21805e6c tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Acquiring lock "67eb58c3-a895-4427-9197-3b0c731a123a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1508.756346] env[67964]: DEBUG oslo_concurrency.lockutils [None req-5d58ebb5-b5e4-4820-ab64-d45d21805e6c tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Lock "67eb58c3-a895-4427-9197-3b0c731a123a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" ::
waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1508.756513] env[67964]: DEBUG oslo_concurrency.lockutils [None req-5d58ebb5-b5e4-4820-ab64-d45d21805e6c tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Lock "67eb58c3-a895-4427-9197-3b0c731a123a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1508.758461] env[67964]: INFO nova.compute.manager [None req-5d58ebb5-b5e4-4820-ab64-d45d21805e6c tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Terminating instance [ 1508.760485] env[67964]: DEBUG nova.compute.manager [None req-5d58ebb5-b5e4-4820-ab64-d45d21805e6c tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1508.760706] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d58ebb5-b5e4-4820-ab64-d45d21805e6c tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1508.763710] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b1bac096-f174-43bc-bbb8-6d6b8c81c422 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.767808] env[67964]: DEBUG nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1508.775860] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79791fb2-f233-4ecd-bc70-6021dfe63871 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.804362] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1508.804362] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-5d58ebb5-b5e4-4820-ab64-d45d21805e6c tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 67eb58c3-a895-4427-9197-3b0c731a123a could not be found. 
[ 1508.804362] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d58ebb5-b5e4-4820-ab64-d45d21805e6c tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1508.804539] env[67964]: INFO nova.compute.manager [None req-5d58ebb5-b5e4-4820-ab64-d45d21805e6c tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1508.804685] env[67964]: DEBUG oslo.service.loopingcall [None req-5d58ebb5-b5e4-4820-ab64-d45d21805e6c tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1508.804930] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1508.805122] env[67964]: DEBUG nova.compute.manager [-] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1508.805219] env[67964]: DEBUG nova.network.neutron [-] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1508.824680] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1508.824934] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1508.826391] env[67964]: INFO nova.compute.claims [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1508.842336] env[67964]: DEBUG nova.network.neutron [-] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1508.855604] env[67964]: INFO nova.compute.manager [-] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] Took 0.05 seconds to deallocate network for instance. 
[ 1508.939671] env[67964]: DEBUG oslo_concurrency.lockutils [None req-5d58ebb5-b5e4-4820-ab64-d45d21805e6c tempest-ServerMetadataTestJSON-1940429130 tempest-ServerMetadataTestJSON-1940429130-project-member] Lock "67eb58c3-a895-4427-9197-3b0c731a123a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.184s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1508.940560] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "67eb58c3-a895-4427-9197-3b0c731a123a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 349.221s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1508.940747] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 67eb58c3-a895-4427-9197-3b0c731a123a] During sync_power_state the instance has a pending task (deleting). Skip. [ 1508.940934] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "67eb58c3-a895-4427-9197-3b0c731a123a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1509.026267] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b82623a-f9c9-4708-9eb3-e54751cb2ff2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1509.034181] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fff0eb4d-6882-4d25-a265-0eb9754f0bc2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1509.064718] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd14eb92-62a6-41ce-9a0a-f122cd142713 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1509.071690] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a8ead08-fd76-456a-897d-84abad9ca4c4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1509.085811] env[67964]: DEBUG nova.compute.provider_tree [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1509.094484] env[67964]: DEBUG nova.scheduler.client.report [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 
'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1509.108841] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1509.109841] env[67964]: DEBUG nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1509.142938] env[67964]: DEBUG nova.compute.utils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1509.146318] env[67964]: DEBUG nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1509.146318] env[67964]: DEBUG nova.network.neutron [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1509.157976] env[67964]: DEBUG nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1509.224427] env[67964]: DEBUG nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1509.227917] env[67964]: DEBUG nova.policy [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'daaca12089eb4485b5607a9d577f33b2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '83336cd0155c4286b66ac327ef1385b5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1509.251930] env[67964]: DEBUG nova.virt.hardware [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=<?>,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T12:20:06Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1509.252198] env[67964]: DEBUG nova.virt.hardware [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1509.252355] env[67964]: DEBUG nova.virt.hardware [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1509.252532] env[67964]: DEBUG nova.virt.hardware [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1509.252680] env[67964]: DEBUG nova.virt.hardware [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1509.252825] env[67964]: DEBUG nova.virt.hardware [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1509.253037] env[67964]: DEBUG nova.virt.hardware [None 
req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1509.253199] env[67964]: DEBUG nova.virt.hardware [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1509.253364] env[67964]: DEBUG nova.virt.hardware [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1509.253524] env[67964]: DEBUG nova.virt.hardware [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1509.253696] env[67964]: DEBUG nova.virt.hardware [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1509.254626] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69024ad9-12f0-4141-9fd6-85579294450d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1509.262517] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59608fc3-6327-4457-aba5-40890d2feb25 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1509.554877] env[67964]: DEBUG nova.network.neutron [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Successfully created port: b171162e-4c87-47ce-8982-224ed0a892cd {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1510.380117] env[67964]: DEBUG nova.network.neutron [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Successfully updated port: b171162e-4c87-47ce-8982-224ed0a892cd {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1510.394195] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "refresh_cache-da8f11e2-6d58-4e28-aabb-9943bc657e60" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1510.395206] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 
tempest-DeleteServersTestJSON-2048211470-project-member] Acquired lock "refresh_cache-da8f11e2-6d58-4e28-aabb-9943bc657e60" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1510.395206] env[67964]: DEBUG nova.network.neutron [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1510.460708] env[67964]: DEBUG nova.network.neutron [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1510.656499] env[67964]: DEBUG nova.compute.manager [req-c165c712-a8c1-43a7-847d-cf8fe783ea78 req-327f92e9-d6ca-4b72-a341-46a75a785723 service nova] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Received event network-vif-plugged-b171162e-4c87-47ce-8982-224ed0a892cd {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1510.656661] env[67964]: DEBUG oslo_concurrency.lockutils [req-c165c712-a8c1-43a7-847d-cf8fe783ea78 req-327f92e9-d6ca-4b72-a341-46a75a785723 service nova] Acquiring lock "da8f11e2-6d58-4e28-aabb-9943bc657e60-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1510.656877] env[67964]: DEBUG oslo_concurrency.lockutils [req-c165c712-a8c1-43a7-847d-cf8fe783ea78 req-327f92e9-d6ca-4b72-a341-46a75a785723 service nova] Lock "da8f11e2-6d58-4e28-aabb-9943bc657e60-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1510.657048] env[67964]: DEBUG oslo_concurrency.lockutils [req-c165c712-a8c1-43a7-847d-cf8fe783ea78 req-327f92e9-d6ca-4b72-a341-46a75a785723 service nova] Lock "da8f11e2-6d58-4e28-aabb-9943bc657e60-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1510.657217] env[67964]: DEBUG nova.compute.manager [req-c165c712-a8c1-43a7-847d-cf8fe783ea78 req-327f92e9-d6ca-4b72-a341-46a75a785723 service nova] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] No waiting events found dispatching network-vif-plugged-b171162e-4c87-47ce-8982-224ed0a892cd {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1510.657395] env[67964]: WARNING nova.compute.manager [req-c165c712-a8c1-43a7-847d-cf8fe783ea78 req-327f92e9-d6ca-4b72-a341-46a75a785723 service nova] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Received unexpected event network-vif-plugged-b171162e-4c87-47ce-8982-224ed0a892cd for instance with vm_state building and task_state spawning. 
[ 1510.657572] env[67964]: DEBUG nova.compute.manager [req-c165c712-a8c1-43a7-847d-cf8fe783ea78 req-327f92e9-d6ca-4b72-a341-46a75a785723 service nova] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Received event network-changed-b171162e-4c87-47ce-8982-224ed0a892cd {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1510.657726] env[67964]: DEBUG nova.compute.manager [req-c165c712-a8c1-43a7-847d-cf8fe783ea78 req-327f92e9-d6ca-4b72-a341-46a75a785723 service nova] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Refreshing instance network info cache due to event network-changed-b171162e-4c87-47ce-8982-224ed0a892cd. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1510.657891] env[67964]: DEBUG oslo_concurrency.lockutils [req-c165c712-a8c1-43a7-847d-cf8fe783ea78 req-327f92e9-d6ca-4b72-a341-46a75a785723 service nova] Acquiring lock "refresh_cache-da8f11e2-6d58-4e28-aabb-9943bc657e60" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1510.658840] env[67964]: DEBUG nova.network.neutron [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Updating instance_info_cache with network_info: [{"id": "b171162e-4c87-47ce-8982-224ed0a892cd", "address": "fa:16:3e:95:63:83", "network": {"id": "545a05d3-b8e2-435d-b1b5-1b6cb9a2d1ae", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1259553375-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83336cd0155c4286b66ac327ef1385b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb171162e-4c", "ovs_interfaceid": "b171162e-4c87-47ce-8982-224ed0a892cd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1510.672526] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Releasing lock "refresh_cache-da8f11e2-6d58-4e28-aabb-9943bc657e60" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1510.672808] env[67964]: DEBUG nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Instance network_info: |[{"id": "b171162e-4c87-47ce-8982-224ed0a892cd", "address": "fa:16:3e:95:63:83", "network": {"id": "545a05d3-b8e2-435d-b1b5-1b6cb9a2d1ae", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1259553375-network", "subnets": [{"cidr": 
"192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83336cd0155c4286b66ac327ef1385b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb171162e-4c", "ovs_interfaceid": "b171162e-4c87-47ce-8982-224ed0a892cd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1510.673088] env[67964]: DEBUG oslo_concurrency.lockutils [req-c165c712-a8c1-43a7-847d-cf8fe783ea78 req-327f92e9-d6ca-4b72-a341-46a75a785723 service nova] Acquired lock "refresh_cache-da8f11e2-6d58-4e28-aabb-9943bc657e60" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1510.673265] env[67964]: DEBUG nova.network.neutron [req-c165c712-a8c1-43a7-847d-cf8fe783ea78 req-327f92e9-d6ca-4b72-a341-46a75a785723 service nova] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Refreshing network info cache for port b171162e-4c87-47ce-8982-224ed0a892cd {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1510.674227] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:95:63:83', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '50886eea-591a-452c-a27b-5f22cfc9df85', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b171162e-4c87-47ce-8982-224ed0a892cd', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1510.682039] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Creating folder: Project (83336cd0155c4286b66ac327ef1385b5). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1510.683074] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4619615e-3f64-4218-ad16-9c088815918f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1510.696644] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Created folder: Project (83336cd0155c4286b66ac327ef1385b5) in parent group-v690366. [ 1510.696644] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Creating folder: Instances. Parent ref: group-v690460. 
{{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1510.696644] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c25818cd-201f-4b4e-8d63-d76308a62e18 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1510.704619] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Created folder: Instances in parent group-v690460. [ 1510.704828] env[67964]: DEBUG oslo.service.loopingcall [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1510.704997] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1510.705190] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ff0df3d9-f2c5-469d-92e7-c4a10d607718 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1510.727778] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1510.727778] env[67964]: value = "task-3456842" [ 1510.727778] env[67964]: _type = "Task" [ 1510.727778] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1510.735015] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456842, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1510.936637] env[67964]: DEBUG nova.network.neutron [req-c165c712-a8c1-43a7-847d-cf8fe783ea78 req-327f92e9-d6ca-4b72-a341-46a75a785723 service nova] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Updated VIF entry in instance network info cache for port b171162e-4c87-47ce-8982-224ed0a892cd. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1510.937024] env[67964]: DEBUG nova.network.neutron [req-c165c712-a8c1-43a7-847d-cf8fe783ea78 req-327f92e9-d6ca-4b72-a341-46a75a785723 service nova] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Updating instance_info_cache with network_info: [{"id": "b171162e-4c87-47ce-8982-224ed0a892cd", "address": "fa:16:3e:95:63:83", "network": {"id": "545a05d3-b8e2-435d-b1b5-1b6cb9a2d1ae", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1259553375-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83336cd0155c4286b66ac327ef1385b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb171162e-4c", "ovs_interfaceid": "b171162e-4c87-47ce-8982-224ed0a892cd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1510.949173] env[67964]: DEBUG oslo_concurrency.lockutils [req-c165c712-a8c1-43a7-847d-cf8fe783ea78 req-327f92e9-d6ca-4b72-a341-46a75a785723 service nova] Releasing lock "refresh_cache-da8f11e2-6d58-4e28-aabb-9943bc657e60" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1511.237991] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456842, 'name': CreateVM_Task, 'duration_secs': 0.300043} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1511.238185] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1511.238852] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1511.239023] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1511.239356] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1511.239625] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-15628ed3-5b66-433a-8d9b-cfad91c1f1e3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1511.243856] env[67964]: DEBUG oslo_vmware.api [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Waiting for the task: (returnval){ [ 1511.243856] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]526f3703-d496-9869-0078-e475a44b3d33" [ 1511.243856] env[67964]: _type = "Task" [ 1511.243856] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1511.251438] env[67964]: DEBUG oslo_vmware.api [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]526f3703-d496-9869-0078-e475a44b3d33, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1511.754397] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1511.754675] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1511.754946] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1511.803764] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1511.803764] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1511.803959] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1511.824831] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1511.824998] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1511.825295] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1511.825436] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1511.825565] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18d6df82-a19a-499a-8874-171218569651] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1511.825690] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1511.825810] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1511.825928] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1511.826056] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1511.826175] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1511.826291] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1511.827048] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1519.818471] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1556.703399] env[67964]: WARNING oslo_vmware.rw_handles [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1556.703399] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1556.703399] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1556.703399] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1556.703399] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1556.703399] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1556.703399] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1556.703399] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1556.703399] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1556.703399] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1556.703399] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1556.703399] env[67964]: ERROR oslo_vmware.rw_handles [ 1556.704174] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/0c965ab6-594e-46ea-ad45-722847752a2b/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1556.705824] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1556.706112] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Copying Virtual Disk [datastore1] vmware_temp/0c965ab6-594e-46ea-ad45-722847752a2b/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] 
vmware_temp/0c965ab6-594e-46ea-ad45-722847752a2b/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1556.706438] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1a1e73fb-b317-4cf0-a5c4-ded626dd9c52 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1556.714986] env[67964]: DEBUG oslo_vmware.api [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Waiting for the task: (returnval){ [ 1556.714986] env[67964]: value = "task-3456843" [ 1556.714986] env[67964]: _type = "Task" [ 1556.714986] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1556.723733] env[67964]: DEBUG oslo_vmware.api [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Task: {'id': task-3456843, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1557.225928] env[67964]: DEBUG oslo_vmware.exceptions [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1557.225928] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1557.225928] env[67964]: ERROR nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1557.225928] env[67964]: Faults: ['InvalidArgument'] [ 1557.225928] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Traceback (most recent call last): [ 1557.225928] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1557.225928] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] yield resources [ 1557.225928] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1557.225928] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] self.driver.spawn(context, instance, image_meta, [ 1557.226221] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1557.226221] env[67964]: ERROR nova.compute.manager [instance: 
09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1557.226221] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1557.226221] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] self._fetch_image_if_missing(context, vi) [ 1557.226221] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1557.226221] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] image_cache(vi, tmp_image_ds_loc) [ 1557.226221] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1557.226221] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] vm_util.copy_virtual_disk( [ 1557.226221] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1557.226221] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] session._wait_for_task(vmdk_copy_task) [ 1557.226221] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1557.226221] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] return self.wait_for_task(task_ref) [ 1557.226221] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1557.226555] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] return evt.wait() [ 1557.226555] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1557.226555] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] result = hub.switch() [ 1557.226555] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1557.226555] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] return self.greenlet.switch() [ 1557.226555] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1557.226555] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] self.f(*self.args, **self.kw) [ 1557.226555] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1557.226555] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] raise exceptions.translate_fault(task_info.error) [ 1557.226555] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1557.226555] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Faults: ['InvalidArgument'] [ 1557.226555] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] [ 1557.226873] env[67964]: INFO nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Terminating instance [ 1557.227503] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1557.227716] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1557.227989] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ef5b5a3e-18a9-495f-b09d-2ec4949ad3ac {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.231576] env[67964]: DEBUG nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1557.231774] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1557.232528] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5753d5b9-de72-407a-9180-e593d0628952 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.236229] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1557.236403] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1557.237381] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1f1e60de-296e-48c6-87c3-f3d7bba77896 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.241929] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1557.243104] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ebdd9ac4-80b8-4fdd-91b8-2b6a260ee35d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.244513] env[67964]: DEBUG oslo_vmware.api [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Waiting for the task: (returnval){ [ 1557.244513] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]527704c3-8230-2dd5-b84d-e906a62ecdce" [ 1557.244513] env[67964]: _type = "Task" [ 1557.244513] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1557.253361] env[67964]: DEBUG oslo_vmware.api [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]527704c3-8230-2dd5-b84d-e906a62ecdce, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1557.506042] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1557.506283] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1557.506474] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Deleting the datastore file [datastore1] 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1557.506744] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-54dee27f-03ea-48d9-929c-d9b6c110878d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.513119] env[67964]: DEBUG oslo_vmware.api [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Waiting for the task: (returnval){ [ 1557.513119] env[67964]: value = "task-3456845" [ 1557.513119] env[67964]: _type = "Task" [ 1557.513119] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1557.520493] env[67964]: DEBUG oslo_vmware.api [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Task: {'id': task-3456845, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1557.755936] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1557.755936] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Creating directory with path [datastore1] vmware_temp/f76a8c1f-cad3-460a-a6f4-3445ac3223c1/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1557.755936] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-32000e84-447e-43ea-8a3c-d2a9eca976e4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.768187] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Created directory with path [datastore1] vmware_temp/f76a8c1f-cad3-460a-a6f4-3445ac3223c1/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1557.768382] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Fetch image to [datastore1] vmware_temp/f76a8c1f-cad3-460a-a6f4-3445ac3223c1/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1557.768544] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/f76a8c1f-cad3-460a-a6f4-3445ac3223c1/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1557.769286] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b2aad07-3906-4ae4-a0e4-0e7a38b311d7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.776070] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-259e2eb0-5765-49c9-b8c1-7ccf4567cedd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.785248] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d26c2049-601d-4350-9f26-9e22c4f8c028 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.816353] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c3728a8-344a-4c49-ac06-6b1b2e51031c {{(pid=67964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.821982] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7a4f40dd-d840-4e83-bfdf-bf05d8e1a2e0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.840863] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1557.888642] env[67964]: DEBUG oslo_vmware.rw_handles [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f76a8c1f-cad3-460a-a6f4-3445ac3223c1/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1557.947576] env[67964]: DEBUG oslo_vmware.rw_handles [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1557.947778] env[67964]: DEBUG oslo_vmware.rw_handles [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f76a8c1f-cad3-460a-a6f4-3445ac3223c1/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1558.023096] env[67964]: DEBUG oslo_vmware.api [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Task: {'id': task-3456845, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065838} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1558.023353] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1558.023534] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1558.023704] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1558.023872] env[67964]: INFO nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Took 0.79 seconds to destroy the instance on the hypervisor. [ 1558.026049] env[67964]: DEBUG nova.compute.claims [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1558.026233] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1558.026441] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1558.215193] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52feba5f-b7d3-40cd-ba87-2e94344ba974 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.222699] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc1fefe3-995b-4373-9faf-b9656ef7985c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.252584] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-564f1563-c6de-4b1e-887a-42cd5afd1e44 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.259689] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-51db82cd-acbd-4033-a25f-e97bbf6eee85 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.272686] env[67964]: DEBUG nova.compute.provider_tree [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1558.282011] env[67964]: DEBUG nova.scheduler.client.report [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1558.299516] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.273s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1558.300120] env[67964]: ERROR nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1558.300120] env[67964]: Faults: ['InvalidArgument'] [ 1558.300120] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Traceback (most recent call last): [ 1558.300120] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1558.300120] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] self.driver.spawn(context, instance, image_meta, [ 1558.300120] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1558.300120] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1558.300120] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1558.300120] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] self._fetch_image_if_missing(context, vi) [ 1558.300120] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1558.300120] env[67964]: ERROR nova.compute.manager [instance: 
09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] image_cache(vi, tmp_image_ds_loc) [ 1558.300120] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1558.300509] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] vm_util.copy_virtual_disk( [ 1558.300509] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1558.300509] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] session._wait_for_task(vmdk_copy_task) [ 1558.300509] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1558.300509] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] return self.wait_for_task(task_ref) [ 1558.300509] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1558.300509] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] return evt.wait() [ 1558.300509] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1558.300509] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] result = hub.switch() [ 1558.300509] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1558.300509] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] return self.greenlet.switch() [ 1558.300509] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1558.300509] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] self.f(*self.args, **self.kw) [ 1558.301028] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1558.301028] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] raise exceptions.translate_fault(task_info.error) [ 1558.301028] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1558.301028] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Faults: ['InvalidArgument'] [ 1558.301028] env[67964]: ERROR nova.compute.manager [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] [ 1558.301028] env[67964]: DEBUG nova.compute.utils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1558.302341] env[67964]: DEBUG 
nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Build of instance 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 was re-scheduled: A specified parameter was not correct: fileType [ 1558.302341] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1558.302715] env[67964]: DEBUG nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1558.302885] env[67964]: DEBUG nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1558.303065] env[67964]: DEBUG nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1558.303321] env[67964]: DEBUG nova.network.neutron [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1558.753369] env[67964]: DEBUG nova.network.neutron [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1558.768871] env[67964]: INFO nova.compute.manager [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Took 0.47 seconds to deallocate network for instance. 
[ 1558.862097] env[67964]: INFO nova.scheduler.client.report [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Deleted allocations for instance 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 [ 1558.883142] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bddc7e6c-22c6-4d55-90bd-a804a2aad93c tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 622.304s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1558.884157] env[67964]: DEBUG oslo_concurrency.lockutils [None req-cefeac35-8a93-44cd-ba0d-78c151cc4af1 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 424.804s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1558.884432] env[67964]: DEBUG oslo_concurrency.lockutils [None req-cefeac35-8a93-44cd-ba0d-78c151cc4af1 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Acquiring lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1558.884587] env[67964]: DEBUG oslo_concurrency.lockutils [None req-cefeac35-8a93-44cd-ba0d-78c151cc4af1 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1558.884765] env[67964]: DEBUG oslo_concurrency.lockutils [None req-cefeac35-8a93-44cd-ba0d-78c151cc4af1 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1558.887213] env[67964]: INFO nova.compute.manager [None req-cefeac35-8a93-44cd-ba0d-78c151cc4af1 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Terminating instance [ 1558.888874] env[67964]: DEBUG nova.compute.manager [None req-cefeac35-8a93-44cd-ba0d-78c151cc4af1 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1558.889081] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-cefeac35-8a93-44cd-ba0d-78c151cc4af1 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1558.889891] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-97dfa2df-4390-45ba-a6df-f369f36f308f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.894936] env[67964]: DEBUG nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1558.902585] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69753a59-1bba-4598-9be3-c9a1987e5279 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.934984] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-cefeac35-8a93-44cd-ba0d-78c151cc4af1 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75 could not be found. [ 1558.935213] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-cefeac35-8a93-44cd-ba0d-78c151cc4af1 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1558.935394] env[67964]: INFO nova.compute.manager [None req-cefeac35-8a93-44cd-ba0d-78c151cc4af1 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1558.935636] env[67964]: DEBUG oslo.service.loopingcall [None req-cefeac35-8a93-44cd-ba0d-78c151cc4af1 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1558.940009] env[67964]: DEBUG nova.compute.manager [-] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1558.940104] env[67964]: DEBUG nova.network.neutron [-] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1558.951585] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1558.951810] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1558.953214] env[67964]: INFO nova.compute.claims [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1558.962853] env[67964]: DEBUG nova.network.neutron [-] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1558.970868] env[67964]: INFO nova.compute.manager [-] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] Took 0.03 seconds to deallocate network for instance. [ 1559.065145] env[67964]: DEBUG oslo_concurrency.lockutils [None req-cefeac35-8a93-44cd-ba0d-78c151cc4af1 tempest-ServersAdminTestJSON-1523959553 tempest-ServersAdminTestJSON-1523959553-project-member] Lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.181s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1559.066179] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 399.347s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1559.066179] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 09cf2e6c-10e7-4017-9f67-ff2a3b9fac75] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1559.066355] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "09cf2e6c-10e7-4017-9f67-ff2a3b9fac75" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1559.138754] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fa95682-73fa-4e2a-b9de-b4b4d989b963 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.146601] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dd1b5db-3b51-4d51-8f27-e13ee8b08049 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.179712] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e705f642-5f60-436a-8905-ecb90a99cdc8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.188794] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3816398-786e-4608-b4fa-f647d186a63d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.204761] env[67964]: DEBUG nova.compute.provider_tree [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1559.219043] env[67964]: DEBUG nova.scheduler.client.report [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1559.232651] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.281s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1559.233138] env[67964]: DEBUG nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Start building networks asynchronously for instance. 
{{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1559.267311] env[67964]: DEBUG nova.compute.utils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1559.268816] env[67964]: DEBUG nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1559.268995] env[67964]: DEBUG nova.network.neutron [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1559.279075] env[67964]: DEBUG nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1559.344160] env[67964]: DEBUG nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1559.352421] env[67964]: DEBUG nova.policy [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4582636e1ee74b61878e4c1badbd563e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '15502e37757142d4afa0577a3e80bfb8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1559.371399] env[67964]: DEBUG nova.virt.hardware [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1559.371642] env[67964]: DEBUG nova.virt.hardware [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1559.371823] env[67964]: DEBUG nova.virt.hardware [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1559.372184] env[67964]: DEBUG nova.virt.hardware [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1559.372372] env[67964]: DEBUG nova.virt.hardware [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1559.372465] env[67964]: DEBUG nova.virt.hardware [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1559.372676] env[67964]: 
DEBUG nova.virt.hardware [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1559.372835] env[67964]: DEBUG nova.virt.hardware [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1559.373015] env[67964]: DEBUG nova.virt.hardware [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1559.373191] env[67964]: DEBUG nova.virt.hardware [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1559.373373] env[67964]: DEBUG nova.virt.hardware [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1559.374522] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2de3d4e1-d100-4603-beb8-6ce7a0d36bd6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.382491] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eb6887c-c661-4c75-8bf4-0bfbc314da61 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.689492] env[67964]: DEBUG nova.network.neutron [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Successfully created port: 13005eab-2ef6-4ab8-a17a-0e0b8ef8b594 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1560.313520] env[67964]: DEBUG nova.network.neutron [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Successfully updated port: 13005eab-2ef6-4ab8-a17a-0e0b8ef8b594 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1560.326778] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "refresh_cache-2c06844d-2c7f-4e27-b3c6-16dfd6047119" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1560.326929] env[67964]: DEBUG oslo_concurrency.lockutils [None 
req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquired lock "refresh_cache-2c06844d-2c7f-4e27-b3c6-16dfd6047119" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1560.327091] env[67964]: DEBUG nova.network.neutron [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1560.365134] env[67964]: DEBUG nova.network.neutron [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1560.596741] env[67964]: DEBUG nova.network.neutron [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Updating instance_info_cache with network_info: [{"id": "13005eab-2ef6-4ab8-a17a-0e0b8ef8b594", "address": "fa:16:3e:a3:d3:74", "network": {"id": "35550b63-2fb8-405c-84f4-2ef94086947d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1240380541-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "15502e37757142d4afa0577a3e80bfb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4d548e7-d762-406a-bb2d-dc7168a8ca67", "external-id": "nsx-vlan-transportzone-796", "segmentation_id": 796, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap13005eab-2e", "ovs_interfaceid": "13005eab-2ef6-4ab8-a17a-0e0b8ef8b594", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1560.611477] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Releasing lock "refresh_cache-2c06844d-2c7f-4e27-b3c6-16dfd6047119" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1560.611774] env[67964]: DEBUG nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Instance network_info: |[{"id": "13005eab-2ef6-4ab8-a17a-0e0b8ef8b594", "address": "fa:16:3e:a3:d3:74", "network": {"id": "35550b63-2fb8-405c-84f4-2ef94086947d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1240380541-network", "subnets": [{"cidr": 
"192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "15502e37757142d4afa0577a3e80bfb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4d548e7-d762-406a-bb2d-dc7168a8ca67", "external-id": "nsx-vlan-transportzone-796", "segmentation_id": 796, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap13005eab-2e", "ovs_interfaceid": "13005eab-2ef6-4ab8-a17a-0e0b8ef8b594", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1560.612175] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a3:d3:74', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b4d548e7-d762-406a-bb2d-dc7168a8ca67', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '13005eab-2ef6-4ab8-a17a-0e0b8ef8b594', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1560.619585] env[67964]: DEBUG oslo.service.loopingcall [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1560.620137] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1560.620383] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d9acda34-f344-420c-a4d2-ac99f5f9cd7a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.640679] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1560.640679] env[67964]: value = "task-3456846" [ 1560.640679] env[67964]: _type = "Task" [ 1560.640679] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1560.648138] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456846, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1560.792581] env[67964]: DEBUG nova.compute.manager [req-f346ac54-4e34-429a-8b20-e3d8c27b9e06 req-d1b68885-8737-4138-9dd8-6e55d6c992f4 service nova] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Received event network-vif-plugged-13005eab-2ef6-4ab8-a17a-0e0b8ef8b594 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1560.792769] env[67964]: DEBUG oslo_concurrency.lockutils [req-f346ac54-4e34-429a-8b20-e3d8c27b9e06 req-d1b68885-8737-4138-9dd8-6e55d6c992f4 service nova] Acquiring lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1560.792979] env[67964]: DEBUG oslo_concurrency.lockutils [req-f346ac54-4e34-429a-8b20-e3d8c27b9e06 req-d1b68885-8737-4138-9dd8-6e55d6c992f4 service nova] Lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1560.793163] env[67964]: DEBUG oslo_concurrency.lockutils [req-f346ac54-4e34-429a-8b20-e3d8c27b9e06 req-d1b68885-8737-4138-9dd8-6e55d6c992f4 service nova] Lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1560.793406] env[67964]: DEBUG nova.compute.manager [req-f346ac54-4e34-429a-8b20-e3d8c27b9e06 req-d1b68885-8737-4138-9dd8-6e55d6c992f4 service nova] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] No waiting events found dispatching network-vif-plugged-13005eab-2ef6-4ab8-a17a-0e0b8ef8b594 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1560.793655] env[67964]: WARNING nova.compute.manager [req-f346ac54-4e34-429a-8b20-e3d8c27b9e06 req-d1b68885-8737-4138-9dd8-6e55d6c992f4 service nova] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Received unexpected event network-vif-plugged-13005eab-2ef6-4ab8-a17a-0e0b8ef8b594 for instance with vm_state building and task_state spawning. [ 1560.793842] env[67964]: DEBUG nova.compute.manager [req-f346ac54-4e34-429a-8b20-e3d8c27b9e06 req-d1b68885-8737-4138-9dd8-6e55d6c992f4 service nova] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Received event network-changed-13005eab-2ef6-4ab8-a17a-0e0b8ef8b594 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1560.794010] env[67964]: DEBUG nova.compute.manager [req-f346ac54-4e34-429a-8b20-e3d8c27b9e06 req-d1b68885-8737-4138-9dd8-6e55d6c992f4 service nova] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Refreshing instance network info cache due to event network-changed-13005eab-2ef6-4ab8-a17a-0e0b8ef8b594. 
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1560.794209] env[67964]: DEBUG oslo_concurrency.lockutils [req-f346ac54-4e34-429a-8b20-e3d8c27b9e06 req-d1b68885-8737-4138-9dd8-6e55d6c992f4 service nova] Acquiring lock "refresh_cache-2c06844d-2c7f-4e27-b3c6-16dfd6047119" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1560.794346] env[67964]: DEBUG oslo_concurrency.lockutils [req-f346ac54-4e34-429a-8b20-e3d8c27b9e06 req-d1b68885-8737-4138-9dd8-6e55d6c992f4 service nova] Acquired lock "refresh_cache-2c06844d-2c7f-4e27-b3c6-16dfd6047119" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1560.794502] env[67964]: DEBUG nova.network.neutron [req-f346ac54-4e34-429a-8b20-e3d8c27b9e06 req-d1b68885-8737-4138-9dd8-6e55d6c992f4 service nova] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Refreshing network info cache for port 13005eab-2ef6-4ab8-a17a-0e0b8ef8b594 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1561.150436] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456846, 'name': CreateVM_Task, 'duration_secs': 0.308091} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1561.150615] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1561.151312] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1561.151480] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1561.151790] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1561.152049] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1072cb8f-bd9c-4365-9ab7-20ae18f03b37 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.156233] env[67964]: DEBUG oslo_vmware.api [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for the task: (returnval){ [ 1561.156233] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52eee69a-bb18-3baf-3332-1c7b68c3aed2" [ 1561.156233] env[67964]: _type = "Task" [ 1561.156233] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1561.165687] env[67964]: DEBUG oslo_vmware.api [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52eee69a-bb18-3baf-3332-1c7b68c3aed2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1561.251741] env[67964]: DEBUG nova.network.neutron [req-f346ac54-4e34-429a-8b20-e3d8c27b9e06 req-d1b68885-8737-4138-9dd8-6e55d6c992f4 service nova] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Updated VIF entry in instance network info cache for port 13005eab-2ef6-4ab8-a17a-0e0b8ef8b594. {{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1561.252118] env[67964]: DEBUG nova.network.neutron [req-f346ac54-4e34-429a-8b20-e3d8c27b9e06 req-d1b68885-8737-4138-9dd8-6e55d6c992f4 service nova] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Updating instance_info_cache with network_info: [{"id": "13005eab-2ef6-4ab8-a17a-0e0b8ef8b594", "address": "fa:16:3e:a3:d3:74", "network": {"id": "35550b63-2fb8-405c-84f4-2ef94086947d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1240380541-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "15502e37757142d4afa0577a3e80bfb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4d548e7-d762-406a-bb2d-dc7168a8ca67", "external-id": "nsx-vlan-transportzone-796", "segmentation_id": 796, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap13005eab-2e", "ovs_interfaceid": "13005eab-2ef6-4ab8-a17a-0e0b8ef8b594", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1561.261288] env[67964]: DEBUG oslo_concurrency.lockutils [req-f346ac54-4e34-429a-8b20-e3d8c27b9e06 req-d1b68885-8737-4138-9dd8-6e55d6c992f4 service nova] Releasing lock "refresh_cache-2c06844d-2c7f-4e27-b3c6-16dfd6047119" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1561.667689] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1561.667990] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1561.667990] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1564.802059] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1564.812966] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1564.813272] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1564.813509] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1564.813724] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1564.815213] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6526d39-b93b-4c40-a3ab-247a33434498 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1564.826341] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8840b65-c9c2-4121-8f90-3f73348aad5d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1564.842185] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8793526-98bc-4feb-a7b9-dfccf45c2750 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1564.848117] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e8d03f4-5a0b-471c-a14c-aeba66b68235 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1564.875742] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180894MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) 
_report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1564.875884] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1564.876081] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1564.944720] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance d9dcb5d4-e8a3-4d4d-af94-1bde87121c08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1564.944889] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9cd7ef82-147a-4303-a773-32b161f819ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1564.945010] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18c148fb-1cd4-4537-9b77-089e9b272f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1564.945146] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18d6df82-a19a-499a-8874-171218569651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1564.945264] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ee34b117-806d-4cc4-98b7-0f40f074cfab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1564.945381] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7825ba9e-8603-4211-b5fe-708276272464 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1564.945498] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec783231-6f62-4177-ba76-4ba688dda077 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1564.945613] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea5f3d40-6494-459a-a917-2602d0718d8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1564.945727] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance da8f11e2-6d58-4e28-aabb-9943bc657e60 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1564.945839] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2c06844d-2c7f-4e27-b3c6-16dfd6047119 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1564.955932] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 41d93bf8-7991-4b52-8ebb-a1988dc627c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1564.967279] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 76ddefb8-a93f-483a-9487-bc05f5dfef3f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1564.967515] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1564.967666] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1565.109998] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0027aa73-ba35-44eb-ae69-77c071b32bdb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.117903] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33f117b0-c3b8-483a-b167-3a32e46d9839 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.147177] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de67ee3c-b7b3-454f-a94f-1351bc528c34 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.154387] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-020994e9-481a-4088-9d59-4aa7d59b5aba {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.166996] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1565.175402] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1565.188238] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1565.189027] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.312s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1566.187784] env[67964]: DEBUG oslo_service.periodic_task [None 
req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1567.800567] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1567.800909] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1569.795889] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1569.800845] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1569.800845] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1569.800845] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1571.801178] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1571.801488] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1571.801488] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1571.824027] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1571.824027] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1571.824027] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1571.824027] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18d6df82-a19a-499a-8874-171218569651] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1571.824027] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1571.824275] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1571.824275] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1571.824275] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1571.824275] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1571.824275] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1571.824444] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1571.824444] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1576.522413] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquiring lock "ea5f3d40-6494-459a-a917-2602d0718d8c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1589.795037] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "430cad73-6b2c-4702-96a0-672f5b4c219f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1589.795366] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "430cad73-6b2c-4702-96a0-672f5b4c219f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1591.582119] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "bc98edf7-889e-4814-b859-d860033ba0cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1591.582443] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "bc98edf7-889e-4814-b859-d860033ba0cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1594.256873] env[67964]: DEBUG oslo_concurrency.lockutils [None req-311b6aa5-b682-4f4f-94bb-e3ee348b4f9a tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "da8f11e2-6d58-4e28-aabb-9943bc657e60" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1597.885732] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43a90994-2e6c-43e7-a497-9047531cfc1c tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1604.879159] env[67964]: WARNING oslo_vmware.rw_handles [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1604.879159] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1604.879159] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1604.879159] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1604.879159] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1604.879159] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1604.879159] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1604.879159] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1604.879159] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1604.879159] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1604.879159] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1604.879159] env[67964]: ERROR oslo_vmware.rw_handles [ 1604.879867] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/f76a8c1f-cad3-460a-a6f4-3445ac3223c1/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1604.881456] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1604.881703] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Copying Virtual Disk [datastore1] vmware_temp/f76a8c1f-cad3-460a-a6f4-3445ac3223c1/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/f76a8c1f-cad3-460a-a6f4-3445ac3223c1/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1604.881980] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-87253317-c7f2-4c8f-8e5c-59b7c22cb28f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1604.889554] env[67964]: DEBUG oslo_vmware.api [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Waiting for the task: (returnval){ [ 
1604.889554] env[67964]: value = "task-3456847" [ 1604.889554] env[67964]: _type = "Task" [ 1604.889554] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1604.897395] env[67964]: DEBUG oslo_vmware.api [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Task: {'id': task-3456847, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1605.400141] env[67964]: DEBUG oslo_vmware.exceptions [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1605.400452] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1605.401011] env[67964]: ERROR nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1605.401011] env[67964]: Faults: ['InvalidArgument'] [ 1605.401011] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Traceback (most recent call last): [ 1605.401011] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1605.401011] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] yield resources [ 1605.401011] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1605.401011] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] self.driver.spawn(context, instance, image_meta, [ 1605.401011] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1605.401011] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1605.401011] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1605.401011] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] self._fetch_image_if_missing(context, vi) [ 1605.401011] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1605.401419] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] 
image_cache(vi, tmp_image_ds_loc) [ 1605.401419] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1605.401419] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] vm_util.copy_virtual_disk( [ 1605.401419] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1605.401419] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] session._wait_for_task(vmdk_copy_task) [ 1605.401419] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1605.401419] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] return self.wait_for_task(task_ref) [ 1605.401419] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1605.401419] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] return evt.wait() [ 1605.401419] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1605.401419] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] result = hub.switch() [ 1605.401419] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1605.401419] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] return self.greenlet.switch() [ 1605.401962] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1605.401962] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] self.f(*self.args, **self.kw) [ 1605.401962] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1605.401962] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] raise exceptions.translate_fault(task_info.error) [ 1605.401962] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1605.401962] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Faults: ['InvalidArgument'] [ 1605.401962] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] [ 1605.401962] env[67964]: INFO nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Terminating instance [ 1605.402949] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 
[ 1605.402949] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1605.403140] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1605.403383] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d1b95e4b-2203-4809-935d-39efb9454dd8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1605.405756] env[67964]: DEBUG nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1605.405949] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1605.406671] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-273a8d89-0aea-4975-88fd-64cea0831e98 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1605.413581] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1605.413799] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9d85c75f-c6a7-4ce7-893b-a27405d056d2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1605.415911] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1605.416097] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
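
Two requests are working against the same cached image here: one holds the per-VMDK lock under devstack-image-cache_base while another tears down its failed instance. The serialization comes from oslo.concurrency's named locks; a minimal sketch of that pattern (the real code also supports external and fair locks, and logs the waited/held times shown throughout this log):

    from oslo_concurrency import lockutils

    CACHE_VMDK = ("[datastore1] devstack-image-cache_base/"
                  "b261268a-9800-40a9-afde-85d61f8eed6a/"
                  "b261268a-9800-40a9-afde-85d61f8eed6a.vmdk")

    def cache_image(fetch_fn):
        # lockutils.lock is a context manager keyed by name; every
        # thread using the same name serializes, which is what keeps
        # two builds from fetching the same cache entry at once.
        with lockutils.lock(CACHE_VMDK):
            fetch_fn()
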
[ 1605.417064] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-814066c0-c4cd-4bec-a018-84ab4380b27d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1605.424033] env[67964]: DEBUG oslo_vmware.api [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for the task: (returnval){
[ 1605.424033] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]524c6ba2-8e98-126f-e3b2-b6024b2e08d5"
[ 1605.424033] env[67964]: _type = "Task"
[ 1605.424033] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1605.437887] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1605.438206] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Creating directory with path [datastore1] vmware_temp/e1242861-220c-4f46-820b-e5ed5b6a4b2c/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1605.438446] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-51f70465-004c-4510-9539-7a25b59181de {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1605.460221] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Created directory with path [datastore1] vmware_temp/e1242861-220c-4f46-820b-e5ed5b6a4b2c/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1605.461028] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Fetch image to [datastore1] vmware_temp/e1242861-220c-4f46-820b-e5ed5b6a4b2c/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1605.461028] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/e1242861-220c-4f46-820b-e5ed5b6a4b2c/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1605.461388] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-674717a9-2a04-46b1-b24c-f0ad44dfb822 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
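
The cache-miss path above follows a standard fetch-if-missing shape: prepare a per-request temp directory (vmware_temp/<uuid>/<image-id>), download into it as tmp-sparse.vmdk, then promote the result into devstack-image-cache_base so later builds hit the cache. A sketch of the control flow under those assumptions (paths and helper names are illustrative, not the real vmops API):

    import os
    import shutil
    import uuid

    def fetch_image_if_missing(cache_dir, temp_root, image_id, download_fn):
        cached = os.path.join(cache_dir, image_id, image_id + ".vmdk")
        if os.path.exists(cached):
            return cached  # cache hit: nothing to download
        # Cache miss: stage the download in a unique temp location so a
        # failed transfer never leaves a half-written file in the cache.
        tmp_dir = os.path.join(temp_root, str(uuid.uuid4()), image_id)
        os.makedirs(tmp_dir, exist_ok=True)
        tmp_path = os.path.join(tmp_dir, "tmp-sparse.vmdk")
        download_fn(tmp_path)
        os.makedirs(os.path.dirname(cached), exist_ok=True)
        shutil.move(tmp_path, cached)  # publish the finished file into the cache
        return cached
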
[ 1605.468332] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c11e4a1-2b33-4bee-9a0e-76382a2efa65 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1605.477359] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6f759f7-c518-4721-ac4b-2db948a9e8be {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1605.508499] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90e9daa1-cb1a-4e74-b988-2fb7f328bd7d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1605.510970] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1605.511178] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1605.511354] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Deleting the datastore file [datastore1] d9dcb5d4-e8a3-4d4d-af94-1bde87121c08 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1605.511581] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8a9c25f3-e2e3-43c2-b1b5-387b08545928 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1605.517097] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f5b454f9-0f02-4be6-86af-7bc9f1fe5ecc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1605.518739] env[67964]: DEBUG oslo_vmware.api [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Waiting for the task: (returnval){
[ 1605.518739] env[67964]: value = "task-3456849"
[ 1605.518739] env[67964]: _type = "Task"
[ 1605.518739] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1605.526211] env[67964]: DEBUG oslo_vmware.api [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Task: {'id': task-3456849, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1605.539708] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1605.591131] env[67964]: DEBUG oslo_vmware.rw_handles [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e1242861-220c-4f46-820b-e5ed5b6a4b2c/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1605.650430] env[67964]: DEBUG oslo_vmware.rw_handles [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1605.650609] env[67964]: DEBUG oslo_vmware.rw_handles [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e1242861-220c-4f46-820b-e5ed5b6a4b2c/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1606.029434] env[67964]: DEBUG oslo_vmware.api [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Task: {'id': task-3456849, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069725} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
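
The image bytes themselves move over plain HTTPS: rw_handles opens a write connection against the host's /folder endpoint (with dcPath and dsName as query parameters) and streams the Glance image iterator into it, 21318656 bytes here. A rough sketch of that streaming upload using requests; this is a simplified assumption of the transfer, since the real handle manages chunked encoding, cookies or service tickets, and SSL options itself:

    import requests

    def upload_to_datastore(image_iter, host, ds_path, size, session_cookie):
        # The datastore file API is a plain HTTP PUT against
        # /folder/<path>, scoped by datacenter path and datastore name.
        url = f"https://{host}:443/folder/{ds_path}"
        params = {"dcPath": "ha-datacenter", "dsName": "datastore1"}
        headers = {"Content-Length": str(size), "Cookie": session_cookie}
        # Passing an iterator as `data` makes requests stream the body
        # instead of buffering the whole image in memory.
        resp = requests.put(url, params=params, headers=headers,
                            data=image_iter, verify=False)
        resp.raise_for_status()
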
[ 1606.029821] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1606.029866] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1606.030020] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1606.030197] env[67964]: INFO nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Took 0.62 seconds to destroy the instance on the hypervisor.
[ 1606.032297] env[67964]: DEBUG nova.compute.claims [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1606.032474] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1606.032683] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1606.220333] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32277bec-5512-4df8-96bd-850ad3b340a1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1606.229297] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64c8bdad-0410-433d-b2c6-9719f2168f58 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1606.259364] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83825d7f-7c47-4963-a716-efc9caddf8a5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
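
When spawn fails, the manager destroys the half-built guest and then gives back the resources it claimed from the resource tracker; the claim object is the undo handle, and abort_instance_claim runs under the same "compute_resources" lock as the original claim. A condensed sketch of that claim/abort pairing, using a hypothetical tracker class rather than nova's real one:

    import threading

    class ResourceTracker:
        def __init__(self, vcpus):
            self._lock = threading.Lock()  # stands in for "compute_resources"
            self.free_vcpus = vcpus

        def instance_claim(self, need):
            with self._lock:
                if need > self.free_vcpus:
                    raise RuntimeError("insufficient resources")
                self.free_vcpus -= need
            return need

        def abort_instance_claim(self, claimed):
            # Build failed after claiming: return the resources under
            # the same lock so concurrent claims see a consistent view.
            with self._lock:
                self.free_vcpus += claimed
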
[ 1606.267182] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f74e990b-77b8-4bd4-a663-e83e7e3cd901 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1606.280170] env[67964]: DEBUG nova.compute.provider_tree [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1606.288792] env[67964]: DEBUG nova.scheduler.client.report [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}}
[ 1606.302452] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.270s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1606.303009] env[67964]: ERROR nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1606.303009] env[67964]: Faults: ['InvalidArgument']
[ 1606.303009] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Traceback (most recent call last):
[ 1606.303009] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1606.303009] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] self.driver.spawn(context, instance, image_meta,
[ 1606.303009] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1606.303009] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1606.303009] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1606.303009] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] self._fetch_image_if_missing(context, vi)
[ 1606.303009] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1606.303009] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] image_cache(vi, tmp_image_ds_loc)
[ 1606.303009] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1606.303397] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] vm_util.copy_virtual_disk(
[ 1606.303397] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1606.303397] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] session._wait_for_task(vmdk_copy_task)
[ 1606.303397] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1606.303397] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] return self.wait_for_task(task_ref)
[ 1606.303397] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1606.303397] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] return evt.wait()
[ 1606.303397] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1606.303397] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] result = hub.switch()
[ 1606.303397] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1606.303397] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] return self.greenlet.switch()
[ 1606.303397] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1606.303397] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] self.f(*self.args, **self.kw)
[ 1606.303707] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1606.303707] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] raise exceptions.translate_fault(task_info.error)
[ 1606.303707] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1606.303707] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Faults: ['InvalidArgument']
[ 1606.303707] env[67964]: ERROR nova.compute.manager [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08]
[ 1606.303707] env[67964]: DEBUG nova.compute.utils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
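
Note that the build failure does not kill the request: _do_build_and_run_instance catches the exception, emits the usage notification, and hands the instance back to the scheduler (the next entries show the re-schedule path). A minimal sketch of that catch-and-reschedule shape, with hypothetical helper callables:

    def do_build_and_run_instance(instance, build_fn, notify_fn, reschedule_fn):
        try:
            build_fn(instance)
        except Exception as exc:
            # Surface the failure for notifications/telemetry, then give
            # the request back to the scheduler instead of failing it
            # outright; a different node may succeed.
            notify_fn(instance, exc)
            reschedule_fn(instance, reason=str(exc))
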
[ 1606.305156] env[67964]: DEBUG nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Build of instance d9dcb5d4-e8a3-4d4d-af94-1bde87121c08 was re-scheduled: A specified parameter was not correct: fileType
[ 1606.305156] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 1606.305568] env[67964]: DEBUG nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 1606.305743] env[67964]: DEBUG nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 1606.305913] env[67964]: DEBUG nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1606.306103] env[67964]: DEBUG nova.network.neutron [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1606.748074] env[67964]: DEBUG nova.network.neutron [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1606.760563] env[67964]: INFO nova.compute.manager [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Took 0.45 seconds to deallocate network for instance.
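
The "does not provide unplug_vifs method" message above comes from a capability probe: cleanup simply calls the driver and treats NotImplementedError as "cannot tell", rather than inspecting the class. A sketch of that probe (hypothetical function names):

    def cleanup_vifs(driver, instance, network_info):
        try:
            driver.unplug_vifs(instance, network_info)
        except NotImplementedError:
            # Optional driver API: the VMware driver does not implement
            # it, so cleanup logs the fact and moves straight on to
            # network deallocation.
            pass
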
[ 1606.852144] env[67964]: INFO nova.scheduler.client.report [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Deleted allocations for instance d9dcb5d4-e8a3-4d4d-af94-1bde87121c08 [ 1606.878881] env[67964]: DEBUG oslo_concurrency.lockutils [None req-593fb918-60a0-4c21-bd2b-b3aeb5540151 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Lock "d9dcb5d4-e8a3-4d4d-af94-1bde87121c08" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 621.068s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1606.880049] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a6c24c3f-f60d-439a-b6fc-935a696b44f8 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Lock "d9dcb5d4-e8a3-4d4d-af94-1bde87121c08" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 425.528s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1606.880691] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a6c24c3f-f60d-439a-b6fc-935a696b44f8 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Acquiring lock "d9dcb5d4-e8a3-4d4d-af94-1bde87121c08-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1606.880691] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a6c24c3f-f60d-439a-b6fc-935a696b44f8 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Lock "d9dcb5d4-e8a3-4d4d-af94-1bde87121c08-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1606.880691] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a6c24c3f-f60d-439a-b6fc-935a696b44f8 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Lock "d9dcb5d4-e8a3-4d4d-af94-1bde87121c08-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1606.882740] env[67964]: INFO nova.compute.manager [None req-a6c24c3f-f60d-439a-b6fc-935a696b44f8 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Terminating instance [ 1606.884379] env[67964]: DEBUG nova.compute.manager [None req-a6c24c3f-f60d-439a-b6fc-935a696b44f8 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1606.884585] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6c24c3f-f60d-439a-b6fc-935a696b44f8 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1606.885037] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-140eb2c1-dfff-4ca5-ae9c-9012a36d81d5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1606.894111] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-739b5a6c-0803-4824-b0f8-467be3413b4e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1606.905961] env[67964]: DEBUG nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1606.926416] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-a6c24c3f-f60d-439a-b6fc-935a696b44f8 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d9dcb5d4-e8a3-4d4d-af94-1bde87121c08 could not be found.
[ 1606.926616] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6c24c3f-f60d-439a-b6fc-935a696b44f8 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1606.926788] env[67964]: INFO nova.compute.manager [None req-a6c24c3f-f60d-439a-b6fc-935a696b44f8 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1606.927035] env[67964]: DEBUG oslo.service.loopingcall [None req-a6c24c3f-f60d-439a-b6fc-935a696b44f8 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
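
This second terminate finds nothing on the backend: the earlier teardown already unregistered the VM and deleted its files, so the lookup raises InstanceNotFound and destroy treats that as success. Deletes have to be idempotent because retries and races like this one are routine; a minimal sketch of that tolerance (hypothetical names):

    class InstanceNotFound(Exception):
        pass

    def destroy(instance, backend_delete):
        try:
            backend_delete(instance)
        except InstanceNotFound:
            # Already gone (earlier teardown or a concurrent delete):
            # report success instead of failing the request.
            pass
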
[ 1606.927271] env[67964]: DEBUG nova.compute.manager [-] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1606.927366] env[67964]: DEBUG nova.network.neutron [-] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1606.951536] env[67964]: DEBUG nova.network.neutron [-] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1606.960640] env[67964]: INFO nova.compute.manager [-] [instance: d9dcb5d4-e8a3-4d4d-af94-1bde87121c08] Took 0.03 seconds to deallocate network for instance.
[ 1606.969889] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1606.970132] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1606.971541] env[67964]: INFO nova.compute.claims [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1607.051610] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a6c24c3f-f60d-439a-b6fc-935a696b44f8 tempest-ServerActionsTestJSON-643117996 tempest-ServerActionsTestJSON-643117996-project-member] Lock "d9dcb5d4-e8a3-4d4d-af94-1bde87121c08" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.172s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1607.157633] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-656f2f94-a18d-47fe-8bf5-a7f2d3a3180e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1607.165309] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-919461d7-8f72-421d-8ffe-da9faca921cf {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1607.193990] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a83ef979-43b1-4807-b48a-feac3b73f19d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1607.200696] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9786fd9-28b8-4933-b461-c28a76b0f1ce {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1607.214364] env[67964]: DEBUG nova.compute.provider_tree [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1607.223490] env[67964]: DEBUG nova.scheduler.client.report [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}}
[ 1607.235937] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1607.236431] env[67964]: DEBUG nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 1607.265837] env[67964]: DEBUG nova.compute.utils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1607.267259] env[67964]: DEBUG nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}}
[ 1607.267444] env[67964]: DEBUG nova.network.neutron [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1607.275519] env[67964]: DEBUG nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
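
The inventory payload above is what placement schedules against: for each resource class the effective capacity is roughly (total - reserved) x allocation_ratio, and max_unit caps any single allocation. With the numbers in this log that gives 192 schedulable vCPUs (48 x 4.0, with at most 16 per instance), 196078 MB of RAM, and 400 GB of disk (at most 95 per instance). A quick check, hedged as an approximation of the placement arithmetic:

    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0, "max_unit": 16},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0, "max_unit": 65530},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0, "max_unit": 95},
    }

    for rc, inv in inventory.items():
        # Effective schedulable capacity, per the usual placement formula.
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, capacity, "max per allocation:", inv["max_unit"])
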
[ 1607.337325] env[67964]: DEBUG nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Start spawning the instance on the hypervisor. {{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
[ 1607.364249] env[67964]: DEBUG nova.virt.hardware [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1607.364596] env[67964]: DEBUG nova.virt.hardware [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1607.364727] env[67964]: DEBUG nova.virt.hardware [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1607.364840] env[67964]: DEBUG nova.virt.hardware [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1607.364981] env[67964]: DEBUG nova.virt.hardware [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1607.365149] env[67964]: DEBUG nova.virt.hardware [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1607.365358] env[67964]: DEBUG nova.virt.hardware [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
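
With no explicit limits or preferences (everything 0:0:0), the topology search just enumerates the factorizations of the vCPU count that fit under the sockets/cores/threads caps; for 1 vCPU that is only 1:1:1, which is why the next entries report a single possible topology. A toy version of that enumeration, ignoring NUMA and preference sorting:

    from itertools import product

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        # Enumerate (sockets, cores, threads) triples whose product is
        # the requested vCPU count and which respect the per-dimension
        # caps; the real code then sorts these by preference.
        for s, c, t in product(range(1, vcpus + 1), repeat=3):
            if s * c * t == vcpus and s <= max_sockets and c <= max_cores and t <= max_threads:
                yield (s, c, t)

    print(list(possible_topologies(1)))  # [(1, 1, 1)]
    print(list(possible_topologies(4)))  # (1, 1, 4), (1, 2, 2), (2, 2, 1), ...
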
[ 1607.365515] env[67964]: DEBUG nova.virt.hardware [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1607.365731] env[67964]: DEBUG nova.virt.hardware [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1607.366023] env[67964]: DEBUG nova.virt.hardware [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1607.366231] env[67964]: DEBUG nova.virt.hardware [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1607.367112] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e66cae61-fafd-4e12-b071-ba12ecf629c9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1607.375234] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59e06770-3910-442c-88bb-738924d6f4fb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1607.535522] env[67964]: DEBUG nova.policy [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7373f7b862cc4f43a074101da040ac07', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '30050a5e509146ea87e6a86263ba0f59', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1607.884700] env[67964]: DEBUG nova.network.neutron [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Successfully created port: a7757de9-271d-482f-b9b4-8bed7b1f14e0 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1608.511106] env[67964]: DEBUG nova.network.neutron [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Successfully updated port: a7757de9-271d-482f-b9b4-8bed7b1f14e0 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1608.527968] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "refresh_cache-41d93bf8-7991-4b52-8ebb-a1988dc627c1" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1608.528146] env[67964]:
DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquired lock "refresh_cache-41d93bf8-7991-4b52-8ebb-a1988dc627c1" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1608.528377] env[67964]: DEBUG nova.network.neutron [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1608.581310] env[67964]: DEBUG nova.network.neutron [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1608.763480] env[67964]: DEBUG nova.network.neutron [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Updating instance_info_cache with network_info: [{"id": "a7757de9-271d-482f-b9b4-8bed7b1f14e0", "address": "fa:16:3e:9f:8c:df", "network": {"id": "4688491e-7bc1-42dc-b5f6-d988d578de92", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1770914470-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30050a5e509146ea87e6a86263ba0f59", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b107fab-ee71-47db-ad4d-3c6f05546843", "external-id": "cl2-zone-554", "segmentation_id": 554, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa7757de9-27", "ovs_interfaceid": "a7757de9-271d-482f-b9b4-8bed7b1f14e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1608.777490] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Releasing lock "refresh_cache-41d93bf8-7991-4b52-8ebb-a1988dc627c1" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1608.777784] env[67964]: DEBUG nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Instance network_info: |[{"id": "a7757de9-271d-482f-b9b4-8bed7b1f14e0", "address": "fa:16:3e:9f:8c:df", "network": {"id": "4688491e-7bc1-42dc-b5f6-d988d578de92", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1770914470-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30050a5e509146ea87e6a86263ba0f59", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b107fab-ee71-47db-ad4d-3c6f05546843", "external-id": "cl2-zone-554", "segmentation_id": 554, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa7757de9-27", "ovs_interfaceid": "a7757de9-271d-482f-b9b4-8bed7b1f14e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1608.778191] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9f:8c:df', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3b107fab-ee71-47db-ad4d-3c6f05546843', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a7757de9-271d-482f-b9b4-8bed7b1f14e0', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1608.786167] env[67964]: DEBUG oslo.service.loopingcall [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1608.786646] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1608.786879] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4616e00c-bf45-47bd-b171-a11de9a9e2da {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1608.806467] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1608.806467] env[67964]: value = "task-3456850" [ 1608.806467] env[67964]: _type = "Task" [ 1608.806467] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1608.808435] env[67964]: DEBUG nova.compute.manager [req-8de33ff7-1e06-40bd-a612-54629a33cf64 req-3ed498ee-fded-4548-9eee-8cb2d2aef47f service nova] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Received event network-vif-plugged-a7757de9-271d-482f-b9b4-8bed7b1f14e0 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1608.808633] env[67964]: DEBUG oslo_concurrency.lockutils [req-8de33ff7-1e06-40bd-a612-54629a33cf64 req-3ed498ee-fded-4548-9eee-8cb2d2aef47f service nova] Acquiring lock "41d93bf8-7991-4b52-8ebb-a1988dc627c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1608.808831] env[67964]: DEBUG oslo_concurrency.lockutils [req-8de33ff7-1e06-40bd-a612-54629a33cf64 req-3ed498ee-fded-4548-9eee-8cb2d2aef47f service nova] Lock "41d93bf8-7991-4b52-8ebb-a1988dc627c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1608.808995] env[67964]: DEBUG oslo_concurrency.lockutils [req-8de33ff7-1e06-40bd-a612-54629a33cf64 req-3ed498ee-fded-4548-9eee-8cb2d2aef47f service nova] Lock "41d93bf8-7991-4b52-8ebb-a1988dc627c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1608.809171] env[67964]: DEBUG nova.compute.manager [req-8de33ff7-1e06-40bd-a612-54629a33cf64 req-3ed498ee-fded-4548-9eee-8cb2d2aef47f service nova] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] No waiting events found dispatching network-vif-plugged-a7757de9-271d-482f-b9b4-8bed7b1f14e0 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1608.809342] env[67964]: WARNING nova.compute.manager [req-8de33ff7-1e06-40bd-a612-54629a33cf64 req-3ed498ee-fded-4548-9eee-8cb2d2aef47f service nova] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Received unexpected event network-vif-plugged-a7757de9-271d-482f-b9b4-8bed7b1f14e0 for instance with vm_state building and task_state spawning. [ 1608.809490] env[67964]: DEBUG nova.compute.manager [req-8de33ff7-1e06-40bd-a612-54629a33cf64 req-3ed498ee-fded-4548-9eee-8cb2d2aef47f service nova] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Received event network-changed-a7757de9-271d-482f-b9b4-8bed7b1f14e0 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1608.809638] env[67964]: DEBUG nova.compute.manager [req-8de33ff7-1e06-40bd-a612-54629a33cf64 req-3ed498ee-fded-4548-9eee-8cb2d2aef47f service nova] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Refreshing instance network info cache due to event network-changed-a7757de9-271d-482f-b9b4-8bed7b1f14e0. 
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1608.809813] env[67964]: DEBUG oslo_concurrency.lockutils [req-8de33ff7-1e06-40bd-a612-54629a33cf64 req-3ed498ee-fded-4548-9eee-8cb2d2aef47f service nova] Acquiring lock "refresh_cache-41d93bf8-7991-4b52-8ebb-a1988dc627c1" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1608.809949] env[67964]: DEBUG oslo_concurrency.lockutils [req-8de33ff7-1e06-40bd-a612-54629a33cf64 req-3ed498ee-fded-4548-9eee-8cb2d2aef47f service nova] Acquired lock "refresh_cache-41d93bf8-7991-4b52-8ebb-a1988dc627c1" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1608.810115] env[67964]: DEBUG nova.network.neutron [req-8de33ff7-1e06-40bd-a612-54629a33cf64 req-3ed498ee-fded-4548-9eee-8cb2d2aef47f service nova] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Refreshing network info cache for port a7757de9-271d-482f-b9b4-8bed7b1f14e0 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1608.819613] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456850, 'name': CreateVM_Task} progress is 5%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1609.151539] env[67964]: DEBUG nova.network.neutron [req-8de33ff7-1e06-40bd-a612-54629a33cf64 req-3ed498ee-fded-4548-9eee-8cb2d2aef47f service nova] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Updated VIF entry in instance network info cache for port a7757de9-271d-482f-b9b4-8bed7b1f14e0. {{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1609.151918] env[67964]: DEBUG nova.network.neutron [req-8de33ff7-1e06-40bd-a612-54629a33cf64 req-3ed498ee-fded-4548-9eee-8cb2d2aef47f service nova] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Updating instance_info_cache with network_info: [{"id": "a7757de9-271d-482f-b9b4-8bed7b1f14e0", "address": "fa:16:3e:9f:8c:df", "network": {"id": "4688491e-7bc1-42dc-b5f6-d988d578de92", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1770914470-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30050a5e509146ea87e6a86263ba0f59", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b107fab-ee71-47db-ad4d-3c6f05546843", "external-id": "cl2-zone-554", "segmentation_id": 554, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa7757de9-27", "ovs_interfaceid": "a7757de9-271d-482f-b9b4-8bed7b1f14e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1609.161269] env[67964]: DEBUG oslo_concurrency.lockutils [req-8de33ff7-1e06-40bd-a612-54629a33cf64 req-3ed498ee-fded-4548-9eee-8cb2d2aef47f service nova] Releasing lock "refresh_cache-41d93bf8-7991-4b52-8ebb-a1988dc627c1" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1609.318405] 
env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456850, 'name': CreateVM_Task, 'duration_secs': 0.295058} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1609.318593] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1609.319229] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1609.319390] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1609.319710] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1609.319951] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-12d7eb66-20d9-4b6a-ad24-4f9eb2a50356 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1609.324014] env[67964]: DEBUG oslo_vmware.api [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for the task: (returnval){ [ 1609.324014] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52abd7de-2c7e-ab01-5a2f-e6b2fd97e752" [ 1609.324014] env[67964]: _type = "Task" [ 1609.324014] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1609.332016] env[67964]: DEBUG oslo_vmware.api [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52abd7de-2c7e-ab01-5a2f-e6b2fd97e752, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1609.834669] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1609.834960] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1609.835160] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1613.150752] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e476c577-57da-4efd-b6df-0dbb80ab5e47 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "41d93bf8-7991-4b52-8ebb-a1988dc627c1" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1624.800445] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1624.810930] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1624.811179] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1624.811348] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1624.811508] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1624.812696] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12f8f40f-1b2a-4070-b138-7756336f0cca 
{{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.821508] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86850a44-55bc-4466-bdec-28ad07d12477 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.834947] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cf3c112-233a-44c3-9cf6-83ba8dd9fdb4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.841039] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38e9d694-30ce-40bc-b85a-05dd2ab5b130 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.870615] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180932MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1624.870762] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1624.870956] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1624.941821] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 9cd7ef82-147a-4303-a773-32b161f819ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1624.941981] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18c148fb-1cd4-4537-9b77-089e9b272f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1624.942124] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18d6df82-a19a-499a-8874-171218569651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1624.942248] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ee34b117-806d-4cc4-98b7-0f40f074cfab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1624.942367] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7825ba9e-8603-4211-b5fe-708276272464 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1624.942483] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec783231-6f62-4177-ba76-4ba688dda077 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1624.942596] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea5f3d40-6494-459a-a917-2602d0718d8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1624.942722] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance da8f11e2-6d58-4e28-aabb-9943bc657e60 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1624.942835] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2c06844d-2c7f-4e27-b3c6-16dfd6047119 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1624.942976] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 41d93bf8-7991-4b52-8ebb-a1988dc627c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1624.953756] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 76ddefb8-a93f-483a-9487-bc05f5dfef3f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1624.964160] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 430cad73-6b2c-4702-96a0-672f5b4c219f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1624.976456] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance bc98edf7-889e-4814-b859-d860033ba0cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1624.976674] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1624.976859] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1625.125587] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58bb691c-038c-4cb1-b0ff-aff51962ee15 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1625.132920] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4c1ef11-9235-42d3-9ec7-989e46ecc8cb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1625.161206] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-569d690e-f4f7-4b68-8522-42a5c4a7cf5c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1625.167963] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5770044f-40f5-4416-9ed1-998208002a54 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1625.181473] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1625.189381] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1625.203691] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1625.203903] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.333s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1626.806838] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquiring lock "c01bc11b-384e-418e-be43-e12d0a845a24" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1626.807158] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Lock "c01bc11b-384e-418e-be43-e12d0a845a24" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1628.204505] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1629.800242] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1629.800589] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1630.795598] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1630.800321] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1630.800684] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1631.800948] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1631.801337] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1633.801283] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1633.801665] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1633.801665] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1633.825403] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1633.825597] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1633.825692] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18d6df82-a19a-499a-8874-171218569651] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1633.825841] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1633.825993] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1633.826133] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1633.826256] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1633.826375] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1633.826491] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1633.826606] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1633.826722] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1642.821588] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1648.600287] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "78e4a99a-35a1-4ad8-91f0-97f0e2a1641a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1648.600837] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "78e4a99a-35a1-4ad8-91f0-97f0e2a1641a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1652.703297] env[67964]: WARNING oslo_vmware.rw_handles [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1652.703297] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1652.703297] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1652.703297] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1652.703297] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1652.703297] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1652.703297] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1652.703297] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1652.703297] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1652.703297] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1652.703297] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1652.703297] env[67964]: ERROR oslo_vmware.rw_handles [ 1652.703883] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/e1242861-220c-4f46-820b-e5ed5b6a4b2c/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1652.706053] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 
9cd7ef82-147a-4303-a773-32b161f819ef] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1652.706337] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Copying Virtual Disk [datastore1] vmware_temp/e1242861-220c-4f46-820b-e5ed5b6a4b2c/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/e1242861-220c-4f46-820b-e5ed5b6a4b2c/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1652.707040] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-29e49a58-2eba-4329-945a-e703b41b4736 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.714686] env[67964]: DEBUG oslo_vmware.api [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for the task: (returnval){ [ 1652.714686] env[67964]: value = "task-3456851" [ 1652.714686] env[67964]: _type = "Task" [ 1652.714686] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1652.722494] env[67964]: DEBUG oslo_vmware.api [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Task: {'id': task-3456851, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1653.225484] env[67964]: DEBUG oslo_vmware.exceptions [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Fault InvalidArgument not matched. 
{{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1653.225762] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1653.226325] env[67964]: ERROR nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1653.226325] env[67964]: Faults: ['InvalidArgument'] [ 1653.226325] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Traceback (most recent call last): [ 1653.226325] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1653.226325] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] yield resources [ 1653.226325] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1653.226325] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] self.driver.spawn(context, instance, image_meta, [ 1653.226325] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1653.226325] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1653.226325] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1653.226325] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] self._fetch_image_if_missing(context, vi) [ 1653.226325] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1653.226325] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] image_cache(vi, tmp_image_ds_loc) [ 1653.226699] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1653.226699] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] vm_util.copy_virtual_disk( [ 1653.226699] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1653.226699] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] session._wait_for_task(vmdk_copy_task) [ 1653.226699] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task 
[ 1653.226699] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] return self.wait_for_task(task_ref) [ 1653.226699] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1653.226699] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] return evt.wait() [ 1653.226699] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1653.226699] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] result = hub.switch() [ 1653.226699] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1653.226699] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] return self.greenlet.switch() [ 1653.226699] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1653.227032] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] self.f(*self.args, **self.kw) [ 1653.227032] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1653.227032] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] raise exceptions.translate_fault(task_info.error) [ 1653.227032] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1653.227032] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Faults: ['InvalidArgument'] [ 1653.227032] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] [ 1653.227032] env[67964]: INFO nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Terminating instance [ 1653.228207] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1653.228416] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1653.228658] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-be5413e1-f98e-4179-8023-796bd3c89992 {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.230766] env[67964]: DEBUG nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1653.231103] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1653.231834] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c9e64e0-9c71-47d5-b4ed-84621dd7b587 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.238649] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1653.238830] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1f1857c0-1cdd-45ce-a795-f569c781e480 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.241015] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1653.241185] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1653.242227] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-45b3dc55-abb9-4542-a905-644ed5918e9d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.246622] env[67964]: DEBUG oslo_vmware.api [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Waiting for the task: (returnval){ [ 1653.246622] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52ad5977-d2bc-6b93-d67c-1857a8be7cbf" [ 1653.246622] env[67964]: _type = "Task" [ 1653.246622] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1653.255052] env[67964]: DEBUG oslo_vmware.api [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52ad5977-d2bc-6b93-d67c-1857a8be7cbf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1653.312110] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1653.312353] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1653.312507] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Deleting the datastore file [datastore1] 9cd7ef82-147a-4303-a773-32b161f819ef {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1653.312762] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3abb8c6e-8e5b-4739-a305-fcb0d953a878 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.319109] env[67964]: DEBUG oslo_vmware.api [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for the task: (returnval){ [ 1653.319109] env[67964]: value = "task-3456853" [ 1653.319109] env[67964]: _type = "Task" [ 1653.319109] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1653.326722] env[67964]: DEBUG oslo_vmware.api [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Task: {'id': task-3456853, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1653.757180] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1653.757532] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Creating directory with path [datastore1] vmware_temp/00854976-c922-46ac-b6de-385fbc780959/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1653.757664] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-32816610-d190-4123-b2d9-eb7d8730b065 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.768506] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Created directory with path [datastore1] vmware_temp/00854976-c922-46ac-b6de-385fbc780959/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1653.768683] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Fetch image to [datastore1] vmware_temp/00854976-c922-46ac-b6de-385fbc780959/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1653.768862] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/00854976-c922-46ac-b6de-385fbc780959/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1653.769539] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5f10c62-c7b1-4268-9f7e-e79a1484e36a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.775632] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67e59ccb-c1c1-4210-b320-5c9aefa2155c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.784293] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-980385b9-aa3f-491f-808c-8e8226fa2508 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.813355] env[67964]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5bc843b-ce79-4862-a754-b50d2184160a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.818486] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-516ea7f7-cec2-4866-8e6e-5eef0dab9a4f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.826935] env[67964]: DEBUG oslo_vmware.api [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Task: {'id': task-3456853, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067642} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1653.827183] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1653.827360] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1653.827526] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1653.827692] env[67964]: INFO nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Took 0.60 seconds to destroy the instance on the hypervisor. 
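[editor's note] The destroy sequence just above (UnregisterVM, DeleteDatastoreFile_Task, "Deleted the datastore file", "Instance destroyed") is driven by oslo.vmware's invoke-then-poll pattern, which is also what produces the recurring "Waiting for the task: (returnval){...}" / "progress is N%" / "completed successfully" lines throughout this log. A minimal sketch of that pattern follows; it is not Nova's code, and the host, credentials, datastore path, and datacenter moref are placeholders, not values from this log:

    # Sketch of the oslo.vmware task-polling pattern behind the
    # "Invoking FileManager.DeleteDatastoreFile_Task ... Waiting for the
    # task ... progress is N% ... completed successfully" lines above.
    # Host, credentials, path, and moref value are placeholders.
    from oslo_vmware import api, vim_util

    # Connects and logs in on construction (cf. the SessionManager.Login
    # lines at the top of this log).
    session = api.VMwareAPISession(
        'vcenter.example.org', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    # invoke_api() issues the SOAP call; wait_for_task() then polls the
    # task's TaskInfo until it succeeds, raising on a VIM fault.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore1] some-instance-dir',
        datacenter=vim_util.get_moref('datacenter-2', 'Datacenter'))
    session.wait_for_task(task)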
[ 1653.829745] env[67964]: DEBUG nova.compute.claims [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1653.829908] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1653.830151] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1653.842696] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1653.891714] env[67964]: DEBUG oslo_vmware.rw_handles [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/00854976-c922-46ac-b6de-385fbc780959/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1653.949925] env[67964]: DEBUG oslo_vmware.rw_handles [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1653.950144] env[67964]: DEBUG oslo_vmware.rw_handles [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/00854976-c922-46ac-b6de-385fbc780959/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1654.071853] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c9ef918-e2cd-4320-a4e3-31f71578989c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.079343] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ee99993-827d-4bad-b953-5cdffded3de9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.108362] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fbabc0c-cc4d-4c30-830a-2bedd659eed8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.114940] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e9f643b-e96c-4fae-a379-d0ee0f5a2057 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.127323] env[67964]: DEBUG nova.compute.provider_tree [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1654.135691] env[67964]: DEBUG nova.scheduler.client.report [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1654.150181] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.320s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1654.150713] env[67964]: ERROR nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1654.150713] env[67964]: Faults: ['InvalidArgument'] [ 1654.150713] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Traceback (most recent call last): [ 1654.150713] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1654.150713] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] 
self.driver.spawn(context, instance, image_meta,
[ 1654.150713] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1654.150713] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1654.150713] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1654.150713] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] self._fetch_image_if_missing(context, vi)
[ 1654.150713] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1654.150713] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] image_cache(vi, tmp_image_ds_loc)
[ 1654.150713] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1654.151126] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] vm_util.copy_virtual_disk(
[ 1654.151126] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1654.151126] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] session._wait_for_task(vmdk_copy_task)
[ 1654.151126] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1654.151126] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] return self.wait_for_task(task_ref)
[ 1654.151126] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1654.151126] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] return evt.wait()
[ 1654.151126] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1654.151126] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] result = hub.switch()
[ 1654.151126] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1654.151126] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] return self.greenlet.switch()
[ 1654.151126] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1654.151126] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] self.f(*self.args, **self.kw)
[ 1654.151516] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1654.151516] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] raise exceptions.translate_fault(task_info.error)
[ 1654.151516] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1654.151516] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Faults: ['InvalidArgument']
[ 1654.151516] env[67964]: ERROR nova.compute.manager [instance: 9cd7ef82-147a-4303-a773-32b161f819ef]
[ 1654.151516] env[67964]: DEBUG nova.compute.utils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1654.152774] env[67964]: DEBUG nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Build of instance 9cd7ef82-147a-4303-a773-32b161f819ef was re-scheduled: A specified parameter was not correct: fileType
[ 1654.152774] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 1654.153160] env[67964]: DEBUG nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 1654.153330] env[67964]: DEBUG nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 1654.153495] env[67964]: DEBUG nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1654.153651] env[67964]: DEBUG nova.network.neutron [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1654.423938] env[67964]: DEBUG nova.network.neutron [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1654.437404] env[67964]: INFO nova.compute.manager [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Took 0.28 seconds to deallocate network for instance.
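The traceback above captures the failing path end to end: after the sparse image is downloaded, _cache_sparse_image asks vCenter to copy it into the image cache with a CopyVirtualDisk task, and wait_for_task polls that task from a looping call, translating a task-level VIM fault into a VimFaultException ("A specified parameter was not correct: fileType", fault class InvalidArgument) that unwinds through spawn and forces the build to be re-scheduled. A minimal sketch of that poll-and-translate pattern follows; session.get_task_info and its fields are assumed names for illustration, not the oslo.vmware API:

    # Sketch of the poll-and-translate pattern seen in the traceback
    # (simplified; get_task_info/fault_list are assumptions, not oslo.vmware).
    import time

    class VimFaultException(Exception):
        """Carries the VIM fault classes, e.g. ['InvalidArgument']."""
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    def wait_for_task(session, task_ref, interval=0.5):
        """Poll a vCenter task until it completes; raise on task error."""
        while True:
            info = session.get_task_info(task_ref)  # hypothetical helper
            if info.state == 'success':
                return info
            if info.state == 'error':
                # Equivalent of: raise exceptions.translate_fault(task_info.error)
                raise VimFaultException(info.fault_list, info.message)
            time.sleep(interval)  # oslo.vmware drives this from a looping call

Because the fault escapes spawn before any VM is created, the terminate that follows finds nothing on the backend, which is why the destroy path below logs InstanceNotFound and still reports the instance destroyed.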
[ 1654.535846] env[67964]: INFO nova.scheduler.client.report [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Deleted allocations for instance 9cd7ef82-147a-4303-a773-32b161f819ef [ 1654.559888] env[67964]: DEBUG oslo_concurrency.lockutils [None req-01b2abe0-e784-4456-a8c7-a5c3b4ecf532 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "9cd7ef82-147a-4303-a773-32b161f819ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 629.433s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1654.561745] env[67964]: DEBUG oslo_concurrency.lockutils [None req-358426c2-f070-4f75-ac39-36ca453f39a6 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "9cd7ef82-147a-4303-a773-32b161f819ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 433.105s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1654.561745] env[67964]: DEBUG oslo_concurrency.lockutils [None req-358426c2-f070-4f75-ac39-36ca453f39a6 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "9cd7ef82-147a-4303-a773-32b161f819ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1654.561745] env[67964]: DEBUG oslo_concurrency.lockutils [None req-358426c2-f070-4f75-ac39-36ca453f39a6 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "9cd7ef82-147a-4303-a773-32b161f819ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1654.561745] env[67964]: DEBUG oslo_concurrency.lockutils [None req-358426c2-f070-4f75-ac39-36ca453f39a6 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "9cd7ef82-147a-4303-a773-32b161f819ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1654.563679] env[67964]: INFO nova.compute.manager [None req-358426c2-f070-4f75-ac39-36ca453f39a6 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Terminating instance [ 1654.565315] env[67964]: DEBUG nova.compute.manager [None req-358426c2-f070-4f75-ac39-36ca453f39a6 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1654.565509] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-358426c2-f070-4f75-ac39-36ca453f39a6 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1654.565961] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-16ae0b96-7702-4149-8095-995a89111d54 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.576385] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-010ed676-3746-4cd8-867f-c2555841bcc9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.589590] env[67964]: DEBUG nova.compute.manager [None req-bbfcd055-21e0-44fb-bbed-dbe19ce90b79 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 76ddefb8-a93f-483a-9487-bc05f5dfef3f] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1654.612414] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-358426c2-f070-4f75-ac39-36ca453f39a6 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9cd7ef82-147a-4303-a773-32b161f819ef could not be found. [ 1654.612645] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-358426c2-f070-4f75-ac39-36ca453f39a6 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1654.612829] env[67964]: INFO nova.compute.manager [None req-358426c2-f070-4f75-ac39-36ca453f39a6 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1654.613096] env[67964]: DEBUG oslo.service.loopingcall [None req-358426c2-f070-4f75-ac39-36ca453f39a6 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1654.613313] env[67964]: DEBUG nova.compute.manager [-] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1654.613409] env[67964]: DEBUG nova.network.neutron [-] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1654.616716] env[67964]: DEBUG nova.compute.manager [None req-bbfcd055-21e0-44fb-bbed-dbe19ce90b79 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 76ddefb8-a93f-483a-9487-bc05f5dfef3f] Instance disappeared before build. 
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1654.636157] env[67964]: DEBUG nova.network.neutron [-] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1654.638463] env[67964]: DEBUG oslo_concurrency.lockutils [None req-bbfcd055-21e0-44fb-bbed-dbe19ce90b79 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "76ddefb8-a93f-483a-9487-bc05f5dfef3f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.108s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1654.647265] env[67964]: INFO nova.compute.manager [-] [instance: 9cd7ef82-147a-4303-a773-32b161f819ef] Took 0.03 seconds to deallocate network for instance. [ 1654.649861] env[67964]: DEBUG nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1654.705896] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1654.706160] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1654.707715] env[67964]: INFO nova.compute.claims [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1654.751555] env[67964]: DEBUG oslo_concurrency.lockutils [None req-358426c2-f070-4f75-ac39-36ca453f39a6 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "9cd7ef82-147a-4303-a773-32b161f819ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.190s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1654.911864] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04d19888-e113-4c58-aef2-e66cf342b4b5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.919536] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7a6cbbf-dd65-40c7-8238-52ff275930a1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.948917] env[67964]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96aae158-ccb5-4c38-a56b-d59c03709ec5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.955657] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e79b007b-d6d4-4c1c-b0e9-1e18439db3f2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.968880] env[67964]: DEBUG nova.compute.provider_tree [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1654.976996] env[67964]: DEBUG nova.scheduler.client.report [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1654.989660] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.283s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1654.990139] env[67964]: DEBUG nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1655.025722] env[67964]: DEBUG nova.compute.utils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1655.027920] env[67964]: DEBUG nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Allocating IP information in the background. 
{{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1655.028129] env[67964]: DEBUG nova.network.neutron [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1655.036807] env[67964]: DEBUG nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1655.089127] env[67964]: DEBUG nova.policy [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b9fc77d3396842ed87ae657b8d6e1dbc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '67838ada47314689881a641ad7dcf20e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1655.103775] env[67964]: DEBUG nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1655.128959] env[67964]: DEBUG nova.virt.hardware [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1655.129261] env[67964]: DEBUG nova.virt.hardware [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1655.129427] env[67964]: DEBUG nova.virt.hardware [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1655.129606] env[67964]: DEBUG nova.virt.hardware [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1655.129749] env[67964]: DEBUG nova.virt.hardware [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1655.129891] env[67964]: DEBUG nova.virt.hardware [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1655.130106] env[67964]: DEBUG nova.virt.hardware [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1655.130263] env[67964]: DEBUG nova.virt.hardware [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1655.130426] env[67964]: DEBUG 
nova.virt.hardware [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1655.130593] env[67964]: DEBUG nova.virt.hardware [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1655.130760] env[67964]: DEBUG nova.virt.hardware [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1655.131621] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5a3af44-c8ad-477e-bde1-b1e4b3d5de69 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1655.139722] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b547f13d-4234-4fc0-9f94-d1fc9cf898d2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1655.401868] env[67964]: DEBUG nova.network.neutron [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Successfully created port: ac6e7c4f-99d9-448e-a740-edb99b3115a1 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1656.273436] env[67964]: DEBUG nova.network.neutron [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Successfully updated port: ac6e7c4f-99d9-448e-a740-edb99b3115a1 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1656.287475] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "refresh_cache-430cad73-6b2c-4702-96a0-672f5b4c219f" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1656.287627] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquired lock "refresh_cache-430cad73-6b2c-4702-96a0-672f5b4c219f" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1656.287776] env[67964]: DEBUG nova.network.neutron [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1656.325527] env[67964]: DEBUG nova.network.neutron [None 
req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1656.506725] env[67964]: DEBUG nova.network.neutron [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Updating instance_info_cache with network_info: [{"id": "ac6e7c4f-99d9-448e-a740-edb99b3115a1", "address": "fa:16:3e:ad:45:02", "network": {"id": "8d0d0ce9-0998-4981-ab81-2a7595742174", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-353799566-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "67838ada47314689881a641ad7dcf20e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3739ba33-c119-432c-9aee-80a62864317d", "external-id": "nsx-vlan-transportzone-474", "segmentation_id": 474, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapac6e7c4f-99", "ovs_interfaceid": "ac6e7c4f-99d9-448e-a740-edb99b3115a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1656.517741] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Releasing lock "refresh_cache-430cad73-6b2c-4702-96a0-672f5b4c219f" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1656.518044] env[67964]: DEBUG nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Instance network_info: |[{"id": "ac6e7c4f-99d9-448e-a740-edb99b3115a1", "address": "fa:16:3e:ad:45:02", "network": {"id": "8d0d0ce9-0998-4981-ab81-2a7595742174", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-353799566-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "67838ada47314689881a641ad7dcf20e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3739ba33-c119-432c-9aee-80a62864317d", "external-id": "nsx-vlan-transportzone-474", "segmentation_id": 474, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapac6e7c4f-99", "ovs_interfaceid": 
"ac6e7c4f-99d9-448e-a740-edb99b3115a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1656.518437] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ad:45:02', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3739ba33-c119-432c-9aee-80a62864317d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ac6e7c4f-99d9-448e-a740-edb99b3115a1', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1656.526434] env[67964]: DEBUG oslo.service.loopingcall [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1656.526909] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1656.527157] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c6a86a97-080f-405d-ab34-ef88ee01413b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1656.547145] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1656.547145] env[67964]: value = "task-3456854" [ 1656.547145] env[67964]: _type = "Task" [ 1656.547145] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1656.554568] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456854, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1656.574942] env[67964]: DEBUG nova.compute.manager [req-28cd76e1-3dbe-4246-aa21-a39b64c2ac21 req-92d5e55a-6194-4f7d-8745-2308bfc220d4 service nova] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Received event network-vif-plugged-ac6e7c4f-99d9-448e-a740-edb99b3115a1 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1656.575270] env[67964]: DEBUG oslo_concurrency.lockutils [req-28cd76e1-3dbe-4246-aa21-a39b64c2ac21 req-92d5e55a-6194-4f7d-8745-2308bfc220d4 service nova] Acquiring lock "430cad73-6b2c-4702-96a0-672f5b4c219f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1656.575394] env[67964]: DEBUG oslo_concurrency.lockutils [req-28cd76e1-3dbe-4246-aa21-a39b64c2ac21 req-92d5e55a-6194-4f7d-8745-2308bfc220d4 service nova] Lock "430cad73-6b2c-4702-96a0-672f5b4c219f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1656.575602] env[67964]: DEBUG oslo_concurrency.lockutils [req-28cd76e1-3dbe-4246-aa21-a39b64c2ac21 req-92d5e55a-6194-4f7d-8745-2308bfc220d4 service nova] Lock "430cad73-6b2c-4702-96a0-672f5b4c219f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1656.575739] env[67964]: DEBUG nova.compute.manager [req-28cd76e1-3dbe-4246-aa21-a39b64c2ac21 req-92d5e55a-6194-4f7d-8745-2308bfc220d4 service nova] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] No waiting events found dispatching network-vif-plugged-ac6e7c4f-99d9-448e-a740-edb99b3115a1 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1656.575813] env[67964]: WARNING nova.compute.manager [req-28cd76e1-3dbe-4246-aa21-a39b64c2ac21 req-92d5e55a-6194-4f7d-8745-2308bfc220d4 service nova] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Received unexpected event network-vif-plugged-ac6e7c4f-99d9-448e-a740-edb99b3115a1 for instance with vm_state building and task_state spawning. [ 1656.575974] env[67964]: DEBUG nova.compute.manager [req-28cd76e1-3dbe-4246-aa21-a39b64c2ac21 req-92d5e55a-6194-4f7d-8745-2308bfc220d4 service nova] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Received event network-changed-ac6e7c4f-99d9-448e-a740-edb99b3115a1 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1656.576308] env[67964]: DEBUG nova.compute.manager [req-28cd76e1-3dbe-4246-aa21-a39b64c2ac21 req-92d5e55a-6194-4f7d-8745-2308bfc220d4 service nova] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Refreshing instance network info cache due to event network-changed-ac6e7c4f-99d9-448e-a740-edb99b3115a1. 
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1656.576509] env[67964]: DEBUG oslo_concurrency.lockutils [req-28cd76e1-3dbe-4246-aa21-a39b64c2ac21 req-92d5e55a-6194-4f7d-8745-2308bfc220d4 service nova] Acquiring lock "refresh_cache-430cad73-6b2c-4702-96a0-672f5b4c219f" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1656.576646] env[67964]: DEBUG oslo_concurrency.lockutils [req-28cd76e1-3dbe-4246-aa21-a39b64c2ac21 req-92d5e55a-6194-4f7d-8745-2308bfc220d4 service nova] Acquired lock "refresh_cache-430cad73-6b2c-4702-96a0-672f5b4c219f" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1656.576825] env[67964]: DEBUG nova.network.neutron [req-28cd76e1-3dbe-4246-aa21-a39b64c2ac21 req-92d5e55a-6194-4f7d-8745-2308bfc220d4 service nova] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Refreshing network info cache for port ac6e7c4f-99d9-448e-a740-edb99b3115a1 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1656.860136] env[67964]: DEBUG nova.network.neutron [req-28cd76e1-3dbe-4246-aa21-a39b64c2ac21 req-92d5e55a-6194-4f7d-8745-2308bfc220d4 service nova] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Updated VIF entry in instance network info cache for port ac6e7c4f-99d9-448e-a740-edb99b3115a1. {{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1656.860499] env[67964]: DEBUG nova.network.neutron [req-28cd76e1-3dbe-4246-aa21-a39b64c2ac21 req-92d5e55a-6194-4f7d-8745-2308bfc220d4 service nova] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Updating instance_info_cache with network_info: [{"id": "ac6e7c4f-99d9-448e-a740-edb99b3115a1", "address": "fa:16:3e:ad:45:02", "network": {"id": "8d0d0ce9-0998-4981-ab81-2a7595742174", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-353799566-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "67838ada47314689881a641ad7dcf20e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3739ba33-c119-432c-9aee-80a62864317d", "external-id": "nsx-vlan-transportzone-474", "segmentation_id": 474, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapac6e7c4f-99", "ovs_interfaceid": "ac6e7c4f-99d9-448e-a740-edb99b3115a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1656.869560] env[67964]: DEBUG oslo_concurrency.lockutils [req-28cd76e1-3dbe-4246-aa21-a39b64c2ac21 req-92d5e55a-6194-4f7d-8745-2308bfc220d4 service nova] Releasing lock "refresh_cache-430cad73-6b2c-4702-96a0-672f5b4c219f" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1657.058915] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456854, 'name': CreateVM_Task, 'duration_secs': 0.297503} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1657.059724] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1657.059941] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1657.060122] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1657.060437] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1657.060688] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-31a5a04b-040b-4c1e-a9c1-0dd9d78e9e19 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1657.065308] env[67964]: DEBUG oslo_vmware.api [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Waiting for the task: (returnval){
[ 1657.065308] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52267c28-e258-f5e6-ac15-19ad6db8524b"
[ 1657.065308] env[67964]: _type = "Task"
[ 1657.065308] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1657.077819] env[67964]: DEBUG oslo_vmware.api [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52267c28-e258-f5e6-ac15-19ad6db8524b, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1657.576794] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1657.577180] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1657.577283] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1684.800321] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1684.818194] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1684.818415] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1684.818580] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1684.818731] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1684.819873] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc5e7d30-5f93-440b-a097-9115c7a88a42 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1684.828680] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58969a47-2bb6-4bd6-bf8e-aad1236500da {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1684.842417] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fa13e5a-14ef-4f99-b1c5-b9c8dbcb8fb8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1684.848486] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-483b97e5-bc7d-4311-a124-1d58a9d09995 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1684.876848] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180941MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1684.877013] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1684.877175] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1685.008731] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18c148fb-1cd4-4537-9b77-089e9b272f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1685.008893] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18d6df82-a19a-499a-8874-171218569651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1685.009059] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ee34b117-806d-4cc4-98b7-0f40f074cfab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1685.009198] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7825ba9e-8603-4211-b5fe-708276272464 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1685.009320] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec783231-6f62-4177-ba76-4ba688dda077 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1685.009439] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea5f3d40-6494-459a-a917-2602d0718d8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1685.009555] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance da8f11e2-6d58-4e28-aabb-9943bc657e60 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1685.009670] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2c06844d-2c7f-4e27-b3c6-16dfd6047119 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1685.009788] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 41d93bf8-7991-4b52-8ebb-a1988dc627c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1685.009897] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 430cad73-6b2c-4702-96a0-672f5b4c219f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1685.021034] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance bc98edf7-889e-4814-b859-d860033ba0cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1685.032851] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c01bc11b-384e-418e-be43-e12d0a845a24 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1685.043269] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1685.043503] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1685.043652] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1685.188864] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26be534d-2e1e-4f10-8ed6-2a38ac98e349 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1685.196128] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29b75d46-a367-4550-b0b9-c0d8113656a7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1685.225460] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-636b30ce-3b0b-4abb-aa27-c4457c1156ae {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1685.232192] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a253225-a0e1-4899-893a-ea8be0c099ff {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1685.244453] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1685.252743] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1685.268445] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1685.268628] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.391s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1690.268694] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1690.269082] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1690.269208] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1691.802583] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1691.802988] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1692.796476] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1692.800095] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1692.800291] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1695.801897] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1695.801897] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1695.802272] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 
1695.822694] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1695.822859] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 18d6df82-a19a-499a-8874-171218569651] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1695.822989] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1695.823180] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1695.823311] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1695.823434] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1695.823563] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1695.823680] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1695.823798] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1695.823916] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1695.824044] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1700.517555] env[67964]: WARNING oslo_vmware.rw_handles [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1700.517555] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1700.517555] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1700.517555] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1700.517555] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1700.517555] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1700.517555] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1700.517555] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1700.517555] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1700.517555] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1700.517555] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1700.517555] env[67964]: ERROR oslo_vmware.rw_handles [ 1700.518651] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/00854976-c922-46ac-b6de-385fbc780959/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1700.521422] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1700.521804] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Copying Virtual Disk [datastore1] vmware_temp/00854976-c922-46ac-b6de-385fbc780959/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/00854976-c922-46ac-b6de-385fbc780959/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1700.522228] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2fb38469-ff41-48a7-a7ec-8ff4d0aa423a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.532017] env[67964]: DEBUG oslo_vmware.api [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 
tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Waiting for the task: (returnval){ [ 1700.532017] env[67964]: value = "task-3456855" [ 1700.532017] env[67964]: _type = "Task" [ 1700.532017] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1700.538392] env[67964]: DEBUG oslo_vmware.api [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Task: {'id': task-3456855, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1701.044024] env[67964]: DEBUG oslo_vmware.exceptions [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1701.044024] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1701.044024] env[67964]: ERROR nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1701.044024] env[67964]: Faults: ['InvalidArgument'] [ 1701.044024] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Traceback (most recent call last): [ 1701.044024] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1701.044024] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] yield resources [ 1701.044024] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1701.044452] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] self.driver.spawn(context, instance, image_meta, [ 1701.044452] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1701.044452] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1701.044452] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1701.044452] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] self._fetch_image_if_missing(context, vi) [ 1701.044452] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1701.044452] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] image_cache(vi, tmp_image_ds_loc) [ 1701.044452] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1701.044452] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] vm_util.copy_virtual_disk( [ 1701.044452] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1701.044452] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] session._wait_for_task(vmdk_copy_task) [ 1701.044452] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1701.044452] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] return self.wait_for_task(task_ref) [ 1701.044862] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1701.044862] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] return evt.wait() [ 1701.044862] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1701.044862] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] result = hub.switch() [ 1701.044862] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1701.044862] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] return self.greenlet.switch() [ 1701.044862] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1701.044862] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] self.f(*self.args, **self.kw) [ 1701.044862] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1701.044862] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] raise exceptions.translate_fault(task_info.error) [ 1701.044862] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1701.044862] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Faults: ['InvalidArgument'] [ 1701.044862] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] [ 1701.045298] env[67964]: INFO nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 
18c148fb-1cd4-4537-9b77-089e9b272f83] Terminating instance [ 1701.045298] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1701.045500] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1701.045737] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-90d32f36-3a4e-4a68-8b80-38d34c0478aa {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.047866] env[67964]: DEBUG nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1701.048068] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1701.048788] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcae266a-b76c-4122-8a33-5d1d220d420e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.055541] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1701.055745] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ec0a1c92-8c42-423f-b939-cea802b04afb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.057844] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1701.058498] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1701.058941] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-df2e2cd9-a194-4248-be00-bee2f33e0b8f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.063393] env[67964]: DEBUG oslo_vmware.api [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Waiting for the task: (returnval){ [ 1701.063393] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52ee7640-c168-1bea-0de8-d43b9fabbea0" [ 1701.063393] env[67964]: _type = "Task" [ 1701.063393] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1701.070510] env[67964]: DEBUG oslo_vmware.api [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52ee7640-c168-1bea-0de8-d43b9fabbea0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1701.126534] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1701.126741] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1701.126913] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Deleting the datastore file [datastore1] 18c148fb-1cd4-4537-9b77-089e9b272f83 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1701.127196] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-47544d3a-0eed-42f9-bb1d-a3080d27a774 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.132940] env[67964]: DEBUG oslo_vmware.api [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Waiting for the task: (returnval){ [ 1701.132940] env[67964]: value = "task-3456857" [ 1701.132940] env[67964]: _type = "Task" [ 1701.132940] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1701.140461] env[67964]: DEBUG oslo_vmware.api [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Task: {'id': task-3456857, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1701.573268] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1701.573559] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Creating directory with path [datastore1] vmware_temp/7b7d646e-b313-40b4-9540-3ec89e8e919e/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1701.573740] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-31b28794-9b8f-4503-9edd-3788bfeb172c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.584984] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Created directory with path [datastore1] vmware_temp/7b7d646e-b313-40b4-9540-3ec89e8e919e/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1701.585206] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Fetch image to [datastore1] vmware_temp/7b7d646e-b313-40b4-9540-3ec89e8e919e/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1701.585386] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/7b7d646e-b313-40b4-9540-3ec89e8e919e/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1701.586115] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ed9fc5d-ce01-466a-8cbb-cad4b256a143 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.592548] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb2309e1-a72d-45a7-8d7f-98e4392aaf3d {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.601454] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0da29a3-5454-46cc-878c-e177ab73539a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.631589] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bce1ab3c-8e7e-40f1-90c7-b21cea01a887 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.642502] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8e88c370-a053-40ea-871e-2dfc63778001 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.644162] env[67964]: DEBUG oslo_vmware.api [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Task: {'id': task-3456857, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071401} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1701.644403] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1701.644583] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1701.644748] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1701.644933] env[67964]: INFO nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Took 0.60 seconds to destroy the instance on the hypervisor. 
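Aside: the CopyVirtualDisk_Task failure and the DeleteDatastoreFile_Task polling above both go through oslo.vmware's task-wait machinery. A minimal standalone sketch of that pattern follows; the host, credentials, and datastore paths are placeholders rather than values from this log, and the real Nova code path runs through nova.virt.vmwareapi.vm_util instead of calling the API directly like this:

from oslo_vmware import api as vmware_api
from oslo_vmware import exceptions as vmware_exc

# Placeholder endpoint and credentials; VMwareAPISession logs in on creation.
session = vmware_api.VMwareAPISession(
    'vcenter.example.test', 'user', 'secret',
    api_retry_count=10, task_poll_interval=0.5)

# Kick off a server-side disk copy, then block on it the way the log's
# wait_for_task calls do.
disk_mgr = session.vim.service_content.virtualDiskManager
task = session.invoke_api(
    session.vim, 'CopyVirtualDisk_Task', disk_mgr,
    sourceName='[datastore1] vmware_temp/example/tmp-sparse.vmdk',
    destName='[datastore1] vmware_temp/example/example.vmdk')
try:
    session.wait_for_task(task)
except vmware_exc.VimFaultException as exc:
    # fault_list carries the raw vSphere fault names, e.g. ['InvalidArgument']
    # as in the "A specified parameter was not correct: fileType" error above.
    print('copy failed with faults:', exc.fault_list)
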
[ 1701.647013] env[67964]: DEBUG nova.compute.claims [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1701.647197] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1701.647429] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1701.663642] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1701.851171] env[67964]: DEBUG oslo_vmware.rw_handles [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7b7d646e-b313-40b4-9540-3ec89e8e919e/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1701.910360] env[67964]: DEBUG oslo_vmware.rw_handles [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1701.910554] env[67964]: DEBUG oslo_vmware.rw_handles [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7b7d646e-b313-40b4-9540-3ec89e8e919e/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1701.919342] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c3ce114-3e2f-432d-bf33-fb42da0c5e82 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.926767] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3ed9002-6f53-4952-ac6d-c9d771ae151d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.955902] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79a517be-f626-43fd-8d45-b9ca2e1cc449 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.962670] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1cf54c4-77af-4d91-9c7b-2ecf6a6b4c2f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.975376] env[67964]: DEBUG nova.compute.provider_tree [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1701.983889] env[67964]: DEBUG nova.scheduler.client.report [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1701.999413] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.352s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1701.999977] env[67964]: ERROR nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1701.999977] env[67964]: Faults: ['InvalidArgument'] [ 1701.999977] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Traceback (most recent call last): [ 1701.999977] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in 
_build_and_run_instance [ 1701.999977] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] self.driver.spawn(context, instance, image_meta, [ 1701.999977] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1701.999977] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1701.999977] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1701.999977] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] self._fetch_image_if_missing(context, vi) [ 1701.999977] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1701.999977] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] image_cache(vi, tmp_image_ds_loc) [ 1701.999977] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1702.000377] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] vm_util.copy_virtual_disk( [ 1702.000377] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1702.000377] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] session._wait_for_task(vmdk_copy_task) [ 1702.000377] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1702.000377] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] return self.wait_for_task(task_ref) [ 1702.000377] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1702.000377] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] return evt.wait() [ 1702.000377] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1702.000377] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] result = hub.switch() [ 1702.000377] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1702.000377] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] return self.greenlet.switch() [ 1702.000377] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1702.000377] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] self.f(*self.args, **self.kw) [ 1702.000780] env[67964]: ERROR nova.compute.manager [instance: 
18c148fb-1cd4-4537-9b77-089e9b272f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1702.000780] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] raise exceptions.translate_fault(task_info.error) [ 1702.000780] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1702.000780] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Faults: ['InvalidArgument'] [ 1702.000780] env[67964]: ERROR nova.compute.manager [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] [ 1702.000780] env[67964]: DEBUG nova.compute.utils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1702.002101] env[67964]: DEBUG nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Build of instance 18c148fb-1cd4-4537-9b77-089e9b272f83 was re-scheduled: A specified parameter was not correct: fileType [ 1702.002101] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1702.002516] env[67964]: DEBUG nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1702.002689] env[67964]: DEBUG nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1702.002855] env[67964]: DEBUG nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1702.003081] env[67964]: DEBUG nova.network.neutron [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1702.512027] env[67964]: DEBUG nova.network.neutron [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1702.527510] env[67964]: INFO nova.compute.manager [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Took 0.52 seconds to deallocate network for instance. [ 1702.644145] env[67964]: INFO nova.scheduler.client.report [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Deleted allocations for instance 18c148fb-1cd4-4537-9b77-089e9b272f83 [ 1702.676541] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fc7a66e5-0757-4013-b848-79b3098d5fb9 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Lock "18c148fb-1cd4-4537-9b77-089e9b272f83" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 563.542s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1702.677718] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d938c483-6950-480b-b7b4-a14de72cce89 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Lock "18c148fb-1cd4-4537-9b77-089e9b272f83" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 366.941s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1702.677939] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d938c483-6950-480b-b7b4-a14de72cce89 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Acquiring lock "18c148fb-1cd4-4537-9b77-089e9b272f83-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1702.678156] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d938c483-6950-480b-b7b4-a14de72cce89 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Lock 
"18c148fb-1cd4-4537-9b77-089e9b272f83-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1702.678337] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d938c483-6950-480b-b7b4-a14de72cce89 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Lock "18c148fb-1cd4-4537-9b77-089e9b272f83-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1702.683073] env[67964]: INFO nova.compute.manager [None req-d938c483-6950-480b-b7b4-a14de72cce89 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Terminating instance [ 1702.684788] env[67964]: DEBUG nova.compute.manager [None req-d938c483-6950-480b-b7b4-a14de72cce89 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1702.684983] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-d938c483-6950-480b-b7b4-a14de72cce89 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1702.685256] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-23e73ba0-7ced-4b49-89c5-94ffa59cb1bb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.691688] env[67964]: DEBUG nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1702.698027] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-679cf7a0-e70c-450c-b045-343a204db75f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.727929] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-d938c483-6950-480b-b7b4-a14de72cce89 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 18c148fb-1cd4-4537-9b77-089e9b272f83 could not be found. 
[ 1702.728170] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-d938c483-6950-480b-b7b4-a14de72cce89 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1702.728353] env[67964]: INFO nova.compute.manager [None req-d938c483-6950-480b-b7b4-a14de72cce89 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1702.728639] env[67964]: DEBUG oslo.service.loopingcall [None req-d938c483-6950-480b-b7b4-a14de72cce89 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1702.731343] env[67964]: DEBUG nova.compute.manager [-] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1702.731450] env[67964]: DEBUG nova.network.neutron [-] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1702.747149] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1702.747208] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1702.748945] env[67964]: INFO nova.compute.claims [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1702.758649] env[67964]: DEBUG nova.network.neutron [-] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1702.770810] env[67964]: INFO nova.compute.manager [-] [instance: 18c148fb-1cd4-4537-9b77-089e9b272f83] Took 0.04 seconds to deallocate network for instance. 
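Aside: the "Claim successful" line is checked against the provider inventory reported repeatedly in this log. Effective capacity per resource class is (total - reserved) * allocation_ratio, with max_unit additionally capping any single allocation (16 VCPU, 65530 MB, 95 GB here), so the m1.nano claim of 1 VCPU / 128 MB / 1 GB fits easily. A small sketch using the exact numbers from the inventory entries above:

# Inventory as logged for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41.
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
}

def capacity(inv):
    # Standard placement capacity formula per resource class.
    return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
            for rc, v in inv.items()}

print(capacity(inventory))  # {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}
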
[ 1702.878061] env[67964]: DEBUG oslo_concurrency.lockutils [None req-d938c483-6950-480b-b7b4-a14de72cce89 tempest-ServerRescueTestJSONUnderV235-963635161 tempest-ServerRescueTestJSONUnderV235-963635161-project-member] Lock "18c148fb-1cd4-4537-9b77-089e9b272f83" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.200s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1702.937997] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19f1a4e1-6d0b-4461-b2b7-d3aa721356c3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.945884] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74c6da0a-e2e4-4233-b8ac-2383cfaaec14 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.976960] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ee65e47-1b6e-4bcc-b05d-465f7c8db7e5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.983530] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fb79c43-44c7-4469-9684-f23a33333edc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.996115] env[67964]: DEBUG nova.compute.provider_tree [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1703.004100] env[67964]: DEBUG nova.scheduler.client.report [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1703.021233] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1703.021713] env[67964]: DEBUG nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Start building networks asynchronously for instance. 
{{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1703.062722] env[67964]: DEBUG nova.compute.utils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1703.063966] env[67964]: DEBUG nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1703.064213] env[67964]: DEBUG nova.network.neutron [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1703.077319] env[67964]: DEBUG nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1703.147666] env[67964]: DEBUG nova.policy [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cb0fcc8c390a4451a06d2ff90ef85253', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b28e13db1c6747e9b6c9fef34def6923', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1703.150666] env[67964]: DEBUG nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1703.183460] env[67964]: DEBUG nova.virt.hardware [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1703.183696] env[67964]: DEBUG nova.virt.hardware [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1703.183848] env[67964]: DEBUG nova.virt.hardware [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1703.184037] env[67964]: DEBUG nova.virt.hardware [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1703.184184] env[67964]: DEBUG nova.virt.hardware [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1703.184330] env[67964]: DEBUG nova.virt.hardware [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1703.184533] env[67964]: DEBUG nova.virt.hardware [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1703.184688] env[67964]: DEBUG nova.virt.hardware [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1703.184851] env[67964]: DEBUG 
nova.virt.hardware [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1703.185019] env[67964]: DEBUG nova.virt.hardware [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1703.185194] env[67964]: DEBUG nova.virt.hardware [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1703.186266] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ffef66c-101c-4364-a58f-b3b1d7d9ca3b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1703.194368] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f050ae2b-11e6-4d4c-a624-6145cc0d3e97 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1703.522157] env[67964]: DEBUG nova.network.neutron [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Successfully created port: d8995b08-de5e-4450-b4a1-63f701cf1b4a {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1704.362230] env[67964]: DEBUG nova.network.neutron [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Successfully updated port: d8995b08-de5e-4450-b4a1-63f701cf1b4a {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1704.381078] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "refresh_cache-bc98edf7-889e-4814-b859-d860033ba0cd" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1704.381275] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquired lock "refresh_cache-bc98edf7-889e-4814-b859-d860033ba0cd" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1704.381449] env[67964]: DEBUG nova.network.neutron [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1704.420623] env[67964]: DEBUG nova.network.neutron [None 
req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1704.575423] env[67964]: DEBUG nova.network.neutron [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Updating instance_info_cache with network_info: [{"id": "d8995b08-de5e-4450-b4a1-63f701cf1b4a", "address": "fa:16:3e:04:16:f3", "network": {"id": "ffcd87e1-0022-450d-8ac9-578aa689bbc3", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-130940161-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b28e13db1c6747e9b6c9fef34def6923", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "489b2441-7132-4942-8b61-49cf0ad4400e", "external-id": "nsx-vlan-transportzone-971", "segmentation_id": 971, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8995b08-de", "ovs_interfaceid": "d8995b08-de5e-4450-b4a1-63f701cf1b4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1704.587935] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Releasing lock "refresh_cache-bc98edf7-889e-4814-b859-d860033ba0cd" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1704.588239] env[67964]: DEBUG nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Instance network_info: |[{"id": "d8995b08-de5e-4450-b4a1-63f701cf1b4a", "address": "fa:16:3e:04:16:f3", "network": {"id": "ffcd87e1-0022-450d-8ac9-578aa689bbc3", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-130940161-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b28e13db1c6747e9b6c9fef34def6923", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "489b2441-7132-4942-8b61-49cf0ad4400e", "external-id": "nsx-vlan-transportzone-971", "segmentation_id": 971, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8995b08-de", "ovs_interfaceid": 
"d8995b08-de5e-4450-b4a1-63f701cf1b4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1704.588621] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:04:16:f3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '489b2441-7132-4942-8b61-49cf0ad4400e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd8995b08-de5e-4450-b4a1-63f701cf1b4a', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1704.596142] env[67964]: DEBUG oslo.service.loopingcall [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1704.596587] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1704.597154] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-44a9504b-f1da-47ef-856e-9f010c54853c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1704.617042] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1704.617042] env[67964]: value = "task-3456858" [ 1704.617042] env[67964]: _type = "Task" [ 1704.617042] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1704.625435] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456858, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1704.643848] env[67964]: DEBUG nova.compute.manager [req-0d6cab42-5190-479f-956f-7601e4934a66 req-0bae489b-0cd9-4091-b837-88f85cb83638 service nova] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Received event network-vif-plugged-d8995b08-de5e-4450-b4a1-63f701cf1b4a {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1704.644079] env[67964]: DEBUG oslo_concurrency.lockutils [req-0d6cab42-5190-479f-956f-7601e4934a66 req-0bae489b-0cd9-4091-b837-88f85cb83638 service nova] Acquiring lock "bc98edf7-889e-4814-b859-d860033ba0cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1704.644280] env[67964]: DEBUG oslo_concurrency.lockutils [req-0d6cab42-5190-479f-956f-7601e4934a66 req-0bae489b-0cd9-4091-b837-88f85cb83638 service nova] Lock "bc98edf7-889e-4814-b859-d860033ba0cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1704.644442] env[67964]: DEBUG oslo_concurrency.lockutils [req-0d6cab42-5190-479f-956f-7601e4934a66 req-0bae489b-0cd9-4091-b837-88f85cb83638 service nova] Lock "bc98edf7-889e-4814-b859-d860033ba0cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1704.644647] env[67964]: DEBUG nova.compute.manager [req-0d6cab42-5190-479f-956f-7601e4934a66 req-0bae489b-0cd9-4091-b837-88f85cb83638 service nova] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] No waiting events found dispatching network-vif-plugged-d8995b08-de5e-4450-b4a1-63f701cf1b4a {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1704.644753] env[67964]: WARNING nova.compute.manager [req-0d6cab42-5190-479f-956f-7601e4934a66 req-0bae489b-0cd9-4091-b837-88f85cb83638 service nova] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Received unexpected event network-vif-plugged-d8995b08-de5e-4450-b4a1-63f701cf1b4a for instance with vm_state building and task_state spawning. [ 1704.644905] env[67964]: DEBUG nova.compute.manager [req-0d6cab42-5190-479f-956f-7601e4934a66 req-0bae489b-0cd9-4091-b837-88f85cb83638 service nova] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Received event network-changed-d8995b08-de5e-4450-b4a1-63f701cf1b4a {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1704.645061] env[67964]: DEBUG nova.compute.manager [req-0d6cab42-5190-479f-956f-7601e4934a66 req-0bae489b-0cd9-4091-b837-88f85cb83638 service nova] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Refreshing instance network info cache due to event network-changed-d8995b08-de5e-4450-b4a1-63f701cf1b4a.
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1704.645241] env[67964]: DEBUG oslo_concurrency.lockutils [req-0d6cab42-5190-479f-956f-7601e4934a66 req-0bae489b-0cd9-4091-b837-88f85cb83638 service nova] Acquiring lock "refresh_cache-bc98edf7-889e-4814-b859-d860033ba0cd" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1704.645371] env[67964]: DEBUG oslo_concurrency.lockutils [req-0d6cab42-5190-479f-956f-7601e4934a66 req-0bae489b-0cd9-4091-b837-88f85cb83638 service nova] Acquired lock "refresh_cache-bc98edf7-889e-4814-b859-d860033ba0cd" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1704.645517] env[67964]: DEBUG nova.network.neutron [req-0d6cab42-5190-479f-956f-7601e4934a66 req-0bae489b-0cd9-4091-b837-88f85cb83638 service nova] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Refreshing network info cache for port d8995b08-de5e-4450-b4a1-63f701cf1b4a {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1705.078053] env[67964]: DEBUG nova.network.neutron [req-0d6cab42-5190-479f-956f-7601e4934a66 req-0bae489b-0cd9-4091-b837-88f85cb83638 service nova] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Updated VIF entry in instance network info cache for port d8995b08-de5e-4450-b4a1-63f701cf1b4a. {{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1705.078365] env[67964]: DEBUG nova.network.neutron [req-0d6cab42-5190-479f-956f-7601e4934a66 req-0bae489b-0cd9-4091-b837-88f85cb83638 service nova] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Updating instance_info_cache with network_info: [{"id": "d8995b08-de5e-4450-b4a1-63f701cf1b4a", "address": "fa:16:3e:04:16:f3", "network": {"id": "ffcd87e1-0022-450d-8ac9-578aa689bbc3", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-130940161-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b28e13db1c6747e9b6c9fef34def6923", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "489b2441-7132-4942-8b61-49cf0ad4400e", "external-id": "nsx-vlan-transportzone-971", "segmentation_id": 971, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8995b08-de", "ovs_interfaceid": "d8995b08-de5e-4450-b4a1-63f701cf1b4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1705.089656] env[67964]: DEBUG oslo_concurrency.lockutils [req-0d6cab42-5190-479f-956f-7601e4934a66 req-0bae489b-0cd9-4091-b837-88f85cb83638 service nova] Releasing lock "refresh_cache-bc98edf7-889e-4814-b859-d860033ba0cd" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1705.128059] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456858, 'name': CreateVM_Task, 'duration_secs': 0.281333} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1705.128059] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1705.128222] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1705.128461] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1705.128823] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1705.129113] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a54e040c-81a5-4c46-b19f-45445074dc7e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1705.133605] env[67964]: DEBUG oslo_vmware.api [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Waiting for the task: (returnval){ [ 1705.133605] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52785b83-24b7-b7be-8787-20e56163a3bf" [ 1705.133605] env[67964]: _type = "Task" [ 1705.133605] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1705.141137] env[67964]: DEBUG oslo_vmware.api [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52785b83-24b7-b7be-8787-20e56163a3bf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1705.643559] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1705.643912] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1705.644044] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1740.802057] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1740.802057] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Cleaning up deleted instances {{(pid=67964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11199}} [ 1740.821890] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] There are 0 instances to clean {{(pid=67964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11208}} [ 1745.821409] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1745.832898] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1745.833130] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1745.833300] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1745.833456] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None 
None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1745.834555] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c175398-7398-42c8-96cd-26e1417b2d12 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.843459] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d1eedfa-6de0-40a9-beb9-221f79cdc4a4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.858382] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1021d671-acbb-4263-85cb-b3b7c00ac022 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.864784] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b37bfaa5-cb30-4720-b86b-290268114112 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.893016] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180912MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1745.893179] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1745.893366] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1745.966611] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 18d6df82-a19a-499a-8874-171218569651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1745.966738] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ee34b117-806d-4cc4-98b7-0f40f074cfab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1745.966867] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7825ba9e-8603-4211-b5fe-708276272464 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1745.966992] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec783231-6f62-4177-ba76-4ba688dda077 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1745.967129] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea5f3d40-6494-459a-a917-2602d0718d8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1745.967248] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance da8f11e2-6d58-4e28-aabb-9943bc657e60 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1745.967365] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2c06844d-2c7f-4e27-b3c6-16dfd6047119 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1745.967478] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 41d93bf8-7991-4b52-8ebb-a1988dc627c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1745.967594] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 430cad73-6b2c-4702-96a0-672f5b4c219f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1745.967709] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance bc98edf7-889e-4814-b859-d860033ba0cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1745.978664] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c01bc11b-384e-418e-be43-e12d0a845a24 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1745.988946] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1745.989189] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1745.989348] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1746.124994] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84e164a4-4eff-4a05-927d-f32358a097eb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.133013] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d846758-c63b-4218-9e74-ccf8a04c2497 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.163274] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e2431ad-c789-4528-b8cc-29e5e3f18fa6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.170085] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab25c5a2-ad25-4b23-9f0b-a572705d62a4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.183066] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1746.190868] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1746.204228] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1746.204382] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.311s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1749.621528] env[67964]: WARNING oslo_vmware.rw_handles [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1749.621528] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1749.621528] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1749.621528] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1749.621528] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1749.621528] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1749.621528] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1749.621528] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1749.621528] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1749.621528] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1749.621528] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1749.621528] env[67964]: ERROR oslo_vmware.rw_handles [ 1749.622230] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/7b7d646e-b313-40b4-9540-3ec89e8e919e/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1749.623947] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1749.624270] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Copying Virtual Disk [datastore1] vmware_temp/7b7d646e-b313-40b4-9540-3ec89e8e919e/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/7b7d646e-b313-40b4-9540-3ec89e8e919e/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1749.624592] env[67964]: DEBUG 
oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4b7adb16-1e68-46ee-8a6a-a04245b9e434 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.633798] env[67964]: DEBUG oslo_vmware.api [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Waiting for the task: (returnval){ [ 1749.633798] env[67964]: value = "task-3456859" [ 1749.633798] env[67964]: _type = "Task" [ 1749.633798] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1749.641476] env[67964]: DEBUG oslo_vmware.api [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Task: {'id': task-3456859, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1750.144721] env[67964]: DEBUG oslo_vmware.exceptions [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1750.145263] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1750.146054] env[67964]: ERROR nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1750.146054] env[67964]: Faults: ['InvalidArgument'] [ 1750.146054] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] Traceback (most recent call last): [ 1750.146054] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1750.146054] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] yield resources [ 1750.146054] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1750.146054] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] self.driver.spawn(context, instance, image_meta, [ 1750.146054] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1750.146054] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1750.146054] 
env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1750.146054] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] self._fetch_image_if_missing(context, vi) [ 1750.146054] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1750.146514] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] image_cache(vi, tmp_image_ds_loc) [ 1750.146514] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1750.146514] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] vm_util.copy_virtual_disk( [ 1750.146514] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1750.146514] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] session._wait_for_task(vmdk_copy_task) [ 1750.146514] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1750.146514] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] return self.wait_for_task(task_ref) [ 1750.146514] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1750.146514] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] return evt.wait() [ 1750.146514] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1750.146514] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] result = hub.switch() [ 1750.146514] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1750.146514] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] return self.greenlet.switch() [ 1750.147117] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1750.147117] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] self.f(*self.args, **self.kw) [ 1750.147117] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1750.147117] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] raise exceptions.translate_fault(task_info.error) [ 1750.147117] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1750.147117] env[67964]: ERROR nova.compute.manager 
[instance: 18d6df82-a19a-499a-8874-171218569651] Faults: ['InvalidArgument'] [ 1750.147117] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] [ 1750.147117] env[67964]: INFO nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Terminating instance [ 1750.148529] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1750.148733] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1750.148971] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bc3eacf6-1b2d-48f5-8d42-0966662022f7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.151119] env[67964]: DEBUG nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1750.151315] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1750.152040] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-474da16f-5ac2-408d-a51a-193cea2abda4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.158712] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1750.158943] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-89eb6712-072e-44f5-85d3-d045b4a754aa {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.161015] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1750.161199] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1750.162204] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6f46748f-d4b6-4c62-9afa-f3ac76f6c325 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.167391] env[67964]: DEBUG oslo_vmware.api [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Waiting for the task: (returnval){ [ 1750.167391] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52f85dd9-1df0-eff8-a4e4-73c06c28b489" [ 1750.167391] env[67964]: _type = "Task" [ 1750.167391] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1750.175045] env[67964]: DEBUG oslo_vmware.api [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52f85dd9-1df0-eff8-a4e4-73c06c28b489, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1750.183505] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1750.183657] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1750.225868] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1750.225868] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1750.225868] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Deleting the datastore file [datastore1] 18d6df82-a19a-499a-8874-171218569651 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1750.225868] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-53646c6f-8c9c-4483-8372-e35396efc0e7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.232212] env[67964]: DEBUG oslo_vmware.api [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Waiting for the task: (returnval){ [ 1750.232212] env[67964]: value = "task-3456861" [ 1750.232212] env[67964]: _type = "Task" [ 1750.232212] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1750.239970] env[67964]: DEBUG oslo_vmware.api [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Task: {'id': task-3456861, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1750.678033] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1750.678345] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Creating directory with path [datastore1] vmware_temp/7049f13b-61a9-462e-ba05-b1590fa22551/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1750.678527] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2a899132-dbaa-413c-9690-db246f0efb38 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.690487] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Created directory with path [datastore1] vmware_temp/7049f13b-61a9-462e-ba05-b1590fa22551/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1750.690676] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Fetch image to [datastore1] vmware_temp/7049f13b-61a9-462e-ba05-b1590fa22551/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1750.690837] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/7049f13b-61a9-462e-ba05-b1590fa22551/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1750.691887] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-261e8583-9567-45e4-b418-994e553924f5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.698751] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3edc37eb-0485-4f48-9933-20ea1e123de8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.707889] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6885925a-340f-4f99-8e15-4e786e43c409 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.742231] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-cff24924-ff3a-4569-bbdd-0d2f4a99cc40 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.749721] env[67964]: DEBUG oslo_vmware.api [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Task: {'id': task-3456861, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078691} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1750.751143] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1750.751379] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1750.751589] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1750.751796] env[67964]: INFO nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Took 0.60 seconds to destroy the instance on the hypervisor. 
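The destroy sequence that just completed follows the same pattern that recurs throughout this log: Nova invokes a vSphere *_Task method through an oslo.vmware session (Folder.CreateVM_Task, VirtualDiskManager.CopyVirtualDisk_Task, FileManager.DeleteDatastoreFile_Task), then blocks in wait_for_task() while the "progress is 0%" poll lines are emitted, and it serializes work on shared image-cache paths with oslo.concurrency locks (the Acquiring/acquired/released lines). A minimal sketch of that pattern using the public oslo.vmware and oslo.concurrency APIs follows; the host, credentials, and datastore path are placeholders, and the exact VMwareAPISession constructor arguments may vary between oslo.vmware releases.

    # Sketch of the invoke-then-wait task pattern visible in this log.
    # VC host/credentials and the datastore path below are placeholders.
    from oslo_concurrency import lockutils
    from oslo_vmware import api as vmware_api

    session = vmware_api.VMwareAPISession(
        'vc.example.test',      # placeholder vCenter host
        'user', 'secret',       # placeholder credentials
        api_retry_count=10,
        task_poll_interval=0.5)  # roughly the polling cadence seen above

    def delete_datastore_file(path, datacenter_ref):
        """Invoke a vSphere *_Task method and block until it completes."""
        file_manager = session.vim.service_content.fileManager
        task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                                  file_manager, name=path,
                                  datacenter=datacenter_ref)
        # wait_for_task() polls the task (the "progress is 0%" lines) and
        # raises a translated fault such as VimFaultException on failure,
        # as seen with the CopyVirtualDisk_Task 'InvalidArgument' error.
        return session.wait_for_task(task)

    # Work on a shared image-cache path is serialized with a named lock,
    # which produces the Acquiring/acquired/released lines in this log.
    with lockutils.lock('[datastore1] devstack-image-cache_base/<image-id>'):
        pass  # fetch, copy, or delete the cached image here

This is a sketch under those assumptions, not the driver's actual code; the real logic lives in nova/virt/vmwareapi/ (vm_util.py, ds_util.py, vmops.py) as the source references in the entries above indicate.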
[ 1750.753530] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-46b0de33-2a68-4c5d-85e3-cfd1a2e2dd2c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.755358] env[67964]: DEBUG nova.compute.claims [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1750.755530] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.755751] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1750.778991] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1750.828305] env[67964]: DEBUG oslo_vmware.rw_handles [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7049f13b-61a9-462e-ba05-b1590fa22551/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1750.889077] env[67964]: DEBUG oslo_vmware.rw_handles [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1750.889271] env[67964]: DEBUG oslo_vmware.rw_handles [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7049f13b-61a9-462e-ba05-b1590fa22551/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1750.983424] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1bd8ce7-dc29-4eeb-a824-bbf6d5695595 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.991095] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bfcc1e4-f003-4c57-bad0-769707bcc16b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.022022] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d752f575-373e-40b8-a717-81d52f58d4b9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.028929] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d470639-8eb0-441b-876e-a3437ed6b36b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.041645] env[67964]: DEBUG nova.compute.provider_tree [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1751.049764] env[67964]: DEBUG nova.scheduler.client.report [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1751.064247] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.308s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1751.064770] env[67964]: ERROR nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1751.064770] env[67964]: Faults: ['InvalidArgument'] [ 1751.064770] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] Traceback (most recent call last): [ 1751.064770] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File 
"/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1751.064770] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] self.driver.spawn(context, instance, image_meta, [ 1751.064770] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1751.064770] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1751.064770] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1751.064770] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] self._fetch_image_if_missing(context, vi) [ 1751.064770] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1751.064770] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] image_cache(vi, tmp_image_ds_loc) [ 1751.064770] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1751.065198] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] vm_util.copy_virtual_disk( [ 1751.065198] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1751.065198] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] session._wait_for_task(vmdk_copy_task) [ 1751.065198] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1751.065198] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] return self.wait_for_task(task_ref) [ 1751.065198] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1751.065198] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] return evt.wait() [ 1751.065198] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1751.065198] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] result = hub.switch() [ 1751.065198] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1751.065198] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] return self.greenlet.switch() [ 1751.065198] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1751.065198] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] self.f(*self.args, **self.kw) [ 1751.065593] 
env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1751.065593] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] raise exceptions.translate_fault(task_info.error) [ 1751.065593] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1751.065593] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] Faults: ['InvalidArgument'] [ 1751.065593] env[67964]: ERROR nova.compute.manager [instance: 18d6df82-a19a-499a-8874-171218569651] [ 1751.065593] env[67964]: DEBUG nova.compute.utils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1751.066785] env[67964]: DEBUG nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Build of instance 18d6df82-a19a-499a-8874-171218569651 was re-scheduled: A specified parameter was not correct: fileType [ 1751.066785] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1751.067174] env[67964]: DEBUG nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1751.067343] env[67964]: DEBUG nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1751.067509] env[67964]: DEBUG nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1751.067667] env[67964]: DEBUG nova.network.neutron [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1751.538226] env[67964]: DEBUG nova.network.neutron [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1751.552274] env[67964]: INFO nova.compute.manager [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Took 0.48 seconds to deallocate network for instance. [ 1751.645636] env[67964]: INFO nova.scheduler.client.report [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Deleted allocations for instance 18d6df82-a19a-499a-8874-171218569651 [ 1751.675017] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a278c845-6e2a-4d09-aa24-8a83a4888e9c tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Lock "18d6df82-a19a-499a-8874-171218569651" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 561.067s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1751.675163] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6aceae69-5f34-4682-bf51-1f1ab0c540d9 tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Lock "18d6df82-a19a-499a-8874-171218569651" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 365.273s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1751.675385] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6aceae69-5f34-4682-bf51-1f1ab0c540d9 tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Acquiring lock "18d6df82-a19a-499a-8874-171218569651-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1751.675590] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6aceae69-5f34-4682-bf51-1f1ab0c540d9 tempest-ServersNegativeTestMultiTenantJSON-118916776 
tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Lock "18d6df82-a19a-499a-8874-171218569651-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1751.675750] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6aceae69-5f34-4682-bf51-1f1ab0c540d9 tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Lock "18d6df82-a19a-499a-8874-171218569651-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1751.680406] env[67964]: INFO nova.compute.manager [None req-6aceae69-5f34-4682-bf51-1f1ab0c540d9 tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Terminating instance [ 1751.683583] env[67964]: DEBUG nova.compute.manager [None req-6aceae69-5f34-4682-bf51-1f1ab0c540d9 tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1751.683936] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6aceae69-5f34-4682-bf51-1f1ab0c540d9 tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1751.684255] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2d360084-ad69-4bbe-8f15-dfe143baa104 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.693477] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c809a09-cd4d-44eb-8036-9b585b712400 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.705585] env[67964]: DEBUG nova.compute.manager [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1751.726446] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-6aceae69-5f34-4682-bf51-1f1ab0c540d9 tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 18d6df82-a19a-499a-8874-171218569651 could not be found. 
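The terminate path above first resolves the instance UUID to a VM managed object via SearchIndex.FindAllByUuid; because the failed build was already cleaned up, the lookup comes back empty and vmops logs the InstanceNotFound warning instead of failing the teardown. A hedged sketch of that lookup, assuming an established oslo_vmware VMwareAPISession named session (illustrative, not Nova's exact code):

    def find_vm_by_instance_uuid(session, instance_uuid):
        # SearchIndex lives on the service content of the connected vCenter/ESX.
        search_index = session.vim.service_content.searchIndex
        # vmSearch=True limits the search to VMs; instanceUuid=True matches the
        # vCenter instance UUID rather than the BIOS UUID.
        vm_refs = session.invoke_api(
            session.vim, 'FindAllByUuid', search_index,
            uuid=instance_uuid, vmSearch=True, instanceUuid=True)
        # An empty result is the "Instance does not exist on backend" case above.
        return vm_refs[0] if vm_refs else None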
[ 1751.726732] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6aceae69-5f34-4682-bf51-1f1ab0c540d9 tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1751.726825] env[67964]: INFO nova.compute.manager [None req-6aceae69-5f34-4682-bf51-1f1ab0c540d9 tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] [instance: 18d6df82-a19a-499a-8874-171218569651] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1751.727082] env[67964]: DEBUG oslo.service.loopingcall [None req-6aceae69-5f34-4682-bf51-1f1ab0c540d9 tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1751.727309] env[67964]: DEBUG nova.compute.manager [-] [instance: 18d6df82-a19a-499a-8874-171218569651] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1751.727404] env[67964]: DEBUG nova.network.neutron [-] [instance: 18d6df82-a19a-499a-8874-171218569651] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1751.756226] env[67964]: DEBUG nova.network.neutron [-] [instance: 18d6df82-a19a-499a-8874-171218569651] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1751.758676] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1751.758909] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1751.760337] env[67964]: INFO nova.compute.claims [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1751.764958] env[67964]: INFO nova.compute.manager [-] [instance: 18d6df82-a19a-499a-8874-171218569651] Took 0.04 seconds to deallocate network for instance. 
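The "Claim successful" record above ties back to the inventory payloads the report client logs for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41: placement computes usable capacity per resource class as (total - reserved) * allocation_ratio, and claims are checked against that. A small worked example over the inventory data from this log (plain Python, not Nova code; min_unit/max_unit/step_size omitted for brevity):

    # Inventory as logged by nova.scheduler.client.report in this section.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)  # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0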
[ 1751.801995] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1751.802286] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1751.802443] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1751.872742] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6aceae69-5f34-4682-bf51-1f1ab0c540d9 tempest-ServersNegativeTestMultiTenantJSON-118916776 tempest-ServersNegativeTestMultiTenantJSON-118916776-project-member] Lock "18d6df82-a19a-499a-8874-171218569651" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.198s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1751.940636] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02e8230b-3b30-408e-ada1-4ac54aafc453 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.947744] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de94cd95-818b-4356-8b82-36b87f4bed66 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.976487] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6bd288f-187d-4375-89c1-1916eff1df49 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.983073] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c945d49-aa89-42dc-adb9-b9cd7334349c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.995505] env[67964]: DEBUG nova.compute.provider_tree [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1752.005067] env[67964]: DEBUG nova.scheduler.client.report [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1752.020781] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.262s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1752.021308] env[67964]: DEBUG nova.compute.manager [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1752.053446] env[67964]: DEBUG nova.compute.utils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1752.054645] env[67964]: DEBUG nova.compute.manager [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Not allocating networking since 'none' was specified. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1953}} [ 1752.064767] env[67964]: DEBUG nova.compute.manager [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1752.127179] env[67964]: DEBUG nova.compute.manager [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1752.157427] env[67964]: DEBUG nova.virt.hardware [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1752.157669] env[67964]: DEBUG nova.virt.hardware [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1752.157859] env[67964]: DEBUG nova.virt.hardware [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1752.158059] env[67964]: DEBUG nova.virt.hardware [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1752.158210] env[67964]: DEBUG nova.virt.hardware [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1752.158369] env[67964]: DEBUG nova.virt.hardware [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1752.158573] env[67964]: DEBUG nova.virt.hardware [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1752.158729] env[67964]: DEBUG nova.virt.hardware [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1752.158894] env[67964]: DEBUG nova.virt.hardware [None 
req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1752.159072] env[67964]: DEBUG nova.virt.hardware [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1752.159247] env[67964]: DEBUG nova.virt.hardware [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1752.160127] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-803fcf33-049a-4713-bc0a-4f7eda78b165 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.168164] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fa2fa72-6bad-40da-a256-f8c042e67512 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.182021] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Instance VIF info [] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1752.187444] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Creating folder: Project (e087f14a0c624ffcac2846a6c49dce5d). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1752.187696] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d68f96de-8305-4ca7-a612-02148b491f3b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.196791] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Created folder: Project (e087f14a0c624ffcac2846a6c49dce5d) in parent group-v690366. [ 1752.196968] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Creating folder: Instances. Parent ref: group-v690467. 
{{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1752.197187] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-057056ad-297f-4845-97a7-37cb78ffa0e1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.206441] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Created folder: Instances in parent group-v690467. [ 1752.206655] env[67964]: DEBUG oslo.service.loopingcall [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1752.206832] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1752.207017] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-93c572f6-5ad1-4d50-ac8b-0502645d2af2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.222113] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1752.222113] env[67964]: value = "task-3456864" [ 1752.222113] env[67964]: _type = "Task" [ 1752.222113] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1752.228779] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456864, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1752.732259] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456864, 'name': CreateVM_Task, 'duration_secs': 0.234965} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1752.732501] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1752.732769] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1752.732944] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1752.733297] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1752.733546] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e5f86638-c3eb-4678-9fd8-699215bf954c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.737737] env[67964]: DEBUG oslo_vmware.api [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Waiting for the task: (returnval){ [ 1752.737737] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]528048f2-6600-1bda-6599-ca4db229619c" [ 1752.737737] env[67964]: _type = "Task" [ 1752.737737] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1752.745064] env[67964]: DEBUG oslo_vmware.api [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]528048f2-6600-1bda-6599-ca4db229619c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1752.808739] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1752.809075] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1753.248440] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1753.248683] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1753.248896] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1753.801118] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1754.800673] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1756.800624] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1756.800920] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1756.800920] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1756.822465] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1756.822629] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1756.822760] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1756.822887] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1756.823015] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1756.823139] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1756.823258] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1756.823375] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1756.823489] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1756.823601] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1756.823718] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1764.800814] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1764.822196] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1764.822353] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Cleaning up deleted instances with incomplete migration {{(pid=67964) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11237}} [ 1773.681774] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1773.702814] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Getting list of instances from cluster (obj){ [ 1773.702814] env[67964]: value = "domain-c8" [ 1773.702814] env[67964]: _type = "ClusterComputeResource" [ 1773.702814] env[67964]: } {{(pid=67964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1773.704098] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1d48e7e-bf6c-41ea-8f1a-033b4fc3deb5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1773.723217] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Got total of 10 instances {{(pid=67964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1773.723390] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid ee34b117-806d-4cc4-98b7-0f40f074cfab {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1773.723596] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 7825ba9e-8603-4211-b5fe-708276272464 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1773.723762] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid ec783231-6f62-4177-ba76-4ba688dda077 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1773.723978] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid ea5f3d40-6494-459a-a917-2602d0718d8c {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1773.724095] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid da8f11e2-6d58-4e28-aabb-9943bc657e60 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1773.724257] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 
2c06844d-2c7f-4e27-b3c6-16dfd6047119 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1773.724416] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 41d93bf8-7991-4b52-8ebb-a1988dc627c1 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1773.724573] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 430cad73-6b2c-4702-96a0-672f5b4c219f {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1773.724733] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid bc98edf7-889e-4814-b859-d860033ba0cd {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1773.724890] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid c01bc11b-384e-418e-be43-e12d0a845a24 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 1773.725280] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "ee34b117-806d-4cc4-98b7-0f40f074cfab" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1773.725460] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "7825ba9e-8603-4211-b5fe-708276272464" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1773.725660] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "ec783231-6f62-4177-ba76-4ba688dda077" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1773.725863] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "ea5f3d40-6494-459a-a917-2602d0718d8c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1773.726078] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "da8f11e2-6d58-4e28-aabb-9943bc657e60" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1773.726291] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1773.726493] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock 
"41d93bf8-7991-4b52-8ebb-a1988dc627c1" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1773.726690] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "430cad73-6b2c-4702-96a0-672f5b4c219f" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1773.726888] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "bc98edf7-889e-4814-b859-d860033ba0cd" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1773.727104] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "c01bc11b-384e-418e-be43-e12d0a845a24" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1785.433604] env[67964]: DEBUG oslo_concurrency.lockutils [None req-880355fb-a06f-4a50-a83b-b20b6ada56d5 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "430cad73-6b2c-4702-96a0-672f5b4c219f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1787.542539] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ded58270-c5e8-4ceb-85ce-d6018a15efa6 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "bc98edf7-889e-4814-b859-d860033ba0cd" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1791.764141] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "07489f39-f57c-4528-80b8-b42056181b8b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1791.764486] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "07489f39-f57c-4528-80b8-b42056181b8b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1794.583649] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "3e0e0504-9c76-4201-baf8-2d9636981f0c" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1794.583965] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "3e0e0504-9c76-4201-baf8-2d9636981f0c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1800.552284] env[67964]: WARNING oslo_vmware.rw_handles [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1800.552284] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1800.552284] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1800.552284] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1800.552284] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1800.552284] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1800.552284] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1800.552284] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1800.552284] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1800.552284] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1800.552284] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1800.552284] env[67964]: ERROR oslo_vmware.rw_handles [ 1800.553022] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/7049f13b-61a9-462e-ba05-b1590fa22551/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1800.554817] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1800.555076] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Copying Virtual Disk [datastore1] vmware_temp/7049f13b-61a9-462e-ba05-b1590fa22551/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] 
vmware_temp/7049f13b-61a9-462e-ba05-b1590fa22551/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1800.555382] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7da9bd11-120c-4e6b-bd9e-0b864f225a3c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1800.563153] env[67964]: DEBUG oslo_vmware.api [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Waiting for the task: (returnval){ [ 1800.563153] env[67964]: value = "task-3456865" [ 1800.563153] env[67964]: _type = "Task" [ 1800.563153] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1800.570802] env[67964]: DEBUG oslo_vmware.api [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Task: {'id': task-3456865, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1801.075336] env[67964]: DEBUG oslo_vmware.exceptions [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1801.075663] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1801.076255] env[67964]: ERROR nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1801.076255] env[67964]: Faults: ['InvalidArgument'] [ 1801.076255] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Traceback (most recent call last): [ 1801.076255] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1801.076255] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] yield resources [ 1801.076255] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1801.076255] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] self.driver.spawn(context, instance, image_meta, [ 1801.076255] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1801.076255] env[67964]: ERROR 
nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1801.076255] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1801.076255] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] self._fetch_image_if_missing(context, vi) [ 1801.076255] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1801.076659] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] image_cache(vi, tmp_image_ds_loc) [ 1801.076659] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1801.076659] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] vm_util.copy_virtual_disk( [ 1801.076659] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1801.076659] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] session._wait_for_task(vmdk_copy_task) [ 1801.076659] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1801.076659] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] return self.wait_for_task(task_ref) [ 1801.076659] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1801.076659] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] return evt.wait() [ 1801.076659] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1801.076659] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] result = hub.switch() [ 1801.076659] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1801.076659] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] return self.greenlet.switch() [ 1801.077075] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1801.077075] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] self.f(*self.args, **self.kw) [ 1801.077075] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1801.077075] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] raise exceptions.translate_fault(task_info.error) [ 1801.077075] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1801.077075] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Faults: ['InvalidArgument'] [ 1801.077075] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] [ 1801.077075] env[67964]: INFO nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Terminating instance [ 1801.078075] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1801.078319] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1801.078588] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fa736b21-ce91-4203-b0f5-16dba01d4555 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.080772] env[67964]: DEBUG nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1801.080966] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1801.081687] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bfb6b5e-e5fa-4db3-9969-65d286724b8e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.088524] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1801.088737] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5f6f31e7-6978-4c4e-8fdb-7ef73046c543 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.090895] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1801.091080] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1801.092058] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-438550d8-95ad-4c6f-8336-a46a46529c86 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.096549] env[67964]: DEBUG oslo_vmware.api [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Waiting for the task: (returnval){ [ 1801.096549] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]523357d7-7166-dd17-47b4-6a155410d2fb" [ 1801.096549] env[67964]: _type = "Task" [ 1801.096549] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1801.103822] env[67964]: DEBUG oslo_vmware.api [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]523357d7-7166-dd17-47b4-6a155410d2fb, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1801.154637] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1801.154862] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1801.155052] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Deleting the datastore file [datastore1] ee34b117-806d-4cc4-98b7-0f40f074cfab {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1801.155332] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7e21d353-3513-4bc7-816d-71ac284cc0db {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.160766] env[67964]: DEBUG oslo_vmware.api [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Waiting for the task: (returnval){ [ 1801.160766] env[67964]: value = "task-3456867" [ 1801.160766] env[67964]: _type = "Task" [ 1801.160766] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1801.168014] env[67964]: DEBUG oslo_vmware.api [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Task: {'id': task-3456867, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1801.607075] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1801.607364] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Creating directory with path [datastore1] vmware_temp/4f5c91a2-2fc6-4502-89e4-c17a21a4b97c/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1801.607581] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4ef104f1-3ff7-4267-8de1-8ca4fdaa498f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.618320] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Created directory with path [datastore1] vmware_temp/4f5c91a2-2fc6-4502-89e4-c17a21a4b97c/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1801.618490] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Fetch image to [datastore1] vmware_temp/4f5c91a2-2fc6-4502-89e4-c17a21a4b97c/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1801.618654] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/4f5c91a2-2fc6-4502-89e4-c17a21a4b97c/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1801.619372] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f075642e-b743-4955-8d49-175aaa4c280b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.625624] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ba5fef9-7443-4ce2-a012-a3a0aba1855c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.634353] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a32295c8-0d5a-4827-8f62-e70b7243a582 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.666012] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-734190ae-f944-48ec-a1a3-375cf6a8fe4a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.672654] env[67964]: DEBUG oslo_vmware.api [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Task: {'id': task-3456867, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074659} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1801.674039] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1801.674231] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1801.674413] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1801.674581] env[67964]: INFO nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 1801.676290] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-74b4c863-8ec1-4eae-a926-592c1ae90d30 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.678127] env[67964]: DEBUG nova.compute.claims [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1801.678304] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1801.678508] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1801.699677] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1801.753590] env[67964]: DEBUG oslo_vmware.rw_handles [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4f5c91a2-2fc6-4502-89e4-c17a21a4b97c/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1801.813044] env[67964]: DEBUG oslo_vmware.rw_handles [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1801.813044] env[67964]: DEBUG oslo_vmware.rw_handles [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4f5c91a2-2fc6-4502-89e4-c17a21a4b97c/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1801.929930] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a1b5c2e-4b5f-41be-9ec7-d53e4bfe169f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.936923] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dfb25c0-646d-4b76-af46-e28108c51027 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.965952] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5249405-c137-4c6e-8345-3c90cb23cee4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.972915] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4161c09f-18f9-416e-b8d8-26e66c688cf2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.985660] env[67964]: DEBUG nova.compute.provider_tree [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1801.994210] env[67964]: DEBUG nova.scheduler.client.report [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1802.007813] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.329s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1802.008352] env[67964]: ERROR nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1802.008352] env[67964]: Faults: ['InvalidArgument'] [ 1802.008352] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Traceback (most recent call last): [ 1802.008352] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1802.008352] 
env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] self.driver.spawn(context, instance, image_meta, [ 1802.008352] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1802.008352] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1802.008352] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1802.008352] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] self._fetch_image_if_missing(context, vi) [ 1802.008352] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1802.008352] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] image_cache(vi, tmp_image_ds_loc) [ 1802.008352] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1802.008755] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] vm_util.copy_virtual_disk( [ 1802.008755] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1802.008755] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] session._wait_for_task(vmdk_copy_task) [ 1802.008755] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1802.008755] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] return self.wait_for_task(task_ref) [ 1802.008755] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1802.008755] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] return evt.wait() [ 1802.008755] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1802.008755] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] result = hub.switch() [ 1802.008755] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1802.008755] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] return self.greenlet.switch() [ 1802.008755] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1802.008755] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] self.f(*self.args, **self.kw) [ 1802.009243] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1802.009243] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] raise exceptions.translate_fault(task_info.error) [ 1802.009243] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1802.009243] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Faults: ['InvalidArgument'] [ 1802.009243] env[67964]: ERROR nova.compute.manager [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] [ 1802.009243] env[67964]: DEBUG nova.compute.utils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1802.010412] env[67964]: DEBUG nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Build of instance ee34b117-806d-4cc4-98b7-0f40f074cfab was re-scheduled: A specified parameter was not correct: fileType [ 1802.010412] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1802.010784] env[67964]: DEBUG nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1802.010952] env[67964]: DEBUG nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1802.011136] env[67964]: DEBUG nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1802.011299] env[67964]: DEBUG nova.network.neutron [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1802.302642] env[67964]: DEBUG nova.network.neutron [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1802.313847] env[67964]: INFO nova.compute.manager [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Took 0.30 seconds to deallocate network for instance. [ 1802.409303] env[67964]: INFO nova.scheduler.client.report [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Deleted allocations for instance ee34b117-806d-4cc4-98b7-0f40f074cfab [ 1802.428863] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de8cf929-59f8-44f4-894a-c916a432d6c9 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "ee34b117-806d-4cc4-98b7-0f40f074cfab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 606.457s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1802.429965] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8b6d5c0b-57b8-4b9f-b206-442a0c933938 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "ee34b117-806d-4cc4-98b7-0f40f074cfab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 409.850s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1802.430241] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8b6d5c0b-57b8-4b9f-b206-442a0c933938 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "ee34b117-806d-4cc4-98b7-0f40f074cfab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1802.430457] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8b6d5c0b-57b8-4b9f-b206-442a0c933938 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "ee34b117-806d-4cc4-98b7-0f40f074cfab-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1802.430621] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8b6d5c0b-57b8-4b9f-b206-442a0c933938 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "ee34b117-806d-4cc4-98b7-0f40f074cfab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1802.432600] env[67964]: INFO nova.compute.manager [None req-8b6d5c0b-57b8-4b9f-b206-442a0c933938 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Terminating instance [ 1802.434236] env[67964]: DEBUG nova.compute.manager [None req-8b6d5c0b-57b8-4b9f-b206-442a0c933938 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1802.434427] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8b6d5c0b-57b8-4b9f-b206-442a0c933938 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1802.434890] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7520c5c6-c530-406f-b6d3-8663f73a3340 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.441492] env[67964]: DEBUG nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1802.447553] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54fe13ea-1ca0-4383-af02-dba65bf0e1bf {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.476044] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-8b6d5c0b-57b8-4b9f-b206-442a0c933938 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ee34b117-806d-4cc4-98b7-0f40f074cfab could not be found. 
[ 1802.476257] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8b6d5c0b-57b8-4b9f-b206-442a0c933938 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1802.476430] env[67964]: INFO nova.compute.manager [None req-8b6d5c0b-57b8-4b9f-b206-442a0c933938 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1802.476665] env[67964]: DEBUG oslo.service.loopingcall [None req-8b6d5c0b-57b8-4b9f-b206-442a0c933938 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1802.478822] env[67964]: DEBUG nova.compute.manager [-] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1802.478911] env[67964]: DEBUG nova.network.neutron [-] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1802.492902] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1802.493147] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1802.494583] env[67964]: INFO nova.compute.claims [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1802.506182] env[67964]: DEBUG nova.network.neutron [-] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1802.520838] env[67964]: INFO nova.compute.manager [-] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] Took 0.04 seconds to deallocate network for instance. 
[ 1802.639968] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8b6d5c0b-57b8-4b9f-b206-442a0c933938 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "ee34b117-806d-4cc4-98b7-0f40f074cfab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.210s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1802.640974] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "ee34b117-806d-4cc4-98b7-0f40f074cfab" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 28.916s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1802.641284] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ee34b117-806d-4cc4-98b7-0f40f074cfab] During sync_power_state the instance has a pending task (deleting). Skip. [ 1802.641573] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "ee34b117-806d-4cc4-98b7-0f40f074cfab" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1802.726395] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2850af30-3c79-4807-bb1a-7d6d650f66e9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.734419] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc865fa7-7ca7-4df6-9eaf-f2207df47abd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.764279] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48955633-1dc1-4e5f-be4c-1d5b32792c75 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.771170] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b39bde46-6683-46b8-a3d7-26aa61dbf85b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.784543] env[67964]: DEBUG nova.compute.provider_tree [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1802.793276] env[67964]: DEBUG nova.scheduler.client.report [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 
'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1802.808504] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.315s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1802.808956] env[67964]: DEBUG nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1802.841211] env[67964]: DEBUG nova.compute.utils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1802.842911] env[67964]: DEBUG nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1802.843116] env[67964]: DEBUG nova.network.neutron [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1802.850978] env[67964]: DEBUG nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1802.905263] env[67964]: DEBUG nova.policy [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7f051b7129e94ac6b20334f348756b49', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2b7f4b97c0ca4859964e6ea23310e9ce', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1802.912034] env[67964]: DEBUG nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1802.938165] env[67964]: DEBUG nova.virt.hardware [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=<?>,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T12:20:06Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1802.938437] env[67964]: DEBUG nova.virt.hardware [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1802.938599] env[67964]: DEBUG nova.virt.hardware [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1802.938783] env[67964]: DEBUG nova.virt.hardware [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1802.938928] env[67964]: DEBUG nova.virt.hardware [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1802.939085] env[67964]: DEBUG nova.virt.hardware [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1802.939293] env[67964]: DEBUG nova.virt.hardware [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1802.939447] env[67964]: DEBUG nova.virt.hardware [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1802.939609] env[67964]: DEBUG nova.virt.hardware [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 
tempest-ServersTestJSON-385895298-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1802.939766] env[67964]: DEBUG nova.virt.hardware [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1802.939931] env[67964]: DEBUG nova.virt.hardware [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1802.940892] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b59538f8-6e6e-4a28-97bb-010cf1d39a9b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.949149] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-155f6096-9950-4830-ba53-1bec014d516d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1803.291026] env[67964]: DEBUG nova.network.neutron [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Successfully created port: 01c3579f-39fd-49bb-a4c9-248b20848ce0 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1804.059091] env[67964]: DEBUG nova.network.neutron [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Successfully updated port: 01c3579f-39fd-49bb-a4c9-248b20848ce0 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1804.085711] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "refresh_cache-78e4a99a-35a1-4ad8-91f0-97f0e2a1641a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1804.085711] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquired lock "refresh_cache-78e4a99a-35a1-4ad8-91f0-97f0e2a1641a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1804.085711] env[67964]: DEBUG nova.network.neutron [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1804.121998] env[67964]: DEBUG nova.network.neutron [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Instance cache missing network info. 
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1804.343147] env[67964]: DEBUG nova.network.neutron [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Updating instance_info_cache with network_info: [{"id": "01c3579f-39fd-49bb-a4c9-248b20848ce0", "address": "fa:16:3e:01:4a:2e", "network": {"id": "aeaa87c8-704e-479e-9bca-e70f676fcf32", "bridge": "br-int", "label": "tempest-ServersTestJSON-644115068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2b7f4b97c0ca4859964e6ea23310e9ce", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7d689fd7-f53e-4fd3-80d9-8d6b8fb7a164", "external-id": "nsx-vlan-transportzone-972", "segmentation_id": 972, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap01c3579f-39", "ovs_interfaceid": "01c3579f-39fd-49bb-a4c9-248b20848ce0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1804.361132] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Releasing lock "refresh_cache-78e4a99a-35a1-4ad8-91f0-97f0e2a1641a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1804.361460] env[67964]: DEBUG nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Instance network_info: |[{"id": "01c3579f-39fd-49bb-a4c9-248b20848ce0", "address": "fa:16:3e:01:4a:2e", "network": {"id": "aeaa87c8-704e-479e-9bca-e70f676fcf32", "bridge": "br-int", "label": "tempest-ServersTestJSON-644115068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2b7f4b97c0ca4859964e6ea23310e9ce", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7d689fd7-f53e-4fd3-80d9-8d6b8fb7a164", "external-id": "nsx-vlan-transportzone-972", "segmentation_id": 972, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap01c3579f-39", "ovs_interfaceid": "01c3579f-39fd-49bb-a4c9-248b20848ce0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1804.361865] env[67964]: DEBUG 
nova.virt.vmwareapi.vmops [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:01:4a:2e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7d689fd7-f53e-4fd3-80d9-8d6b8fb7a164', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '01c3579f-39fd-49bb-a4c9-248b20848ce0', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1804.370031] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Creating folder: Project (2b7f4b97c0ca4859964e6ea23310e9ce). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1804.370604] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-01b28b32-d8c7-4780-a122-b16a0de6cb66 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.381317] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Created folder: Project (2b7f4b97c0ca4859964e6ea23310e9ce) in parent group-v690366. [ 1804.381513] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Creating folder: Instances. Parent ref: group-v690470. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1804.382731] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9d1d7335-a48e-4292-bc51-860f3c703f9d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.385408] env[67964]: DEBUG nova.compute.manager [req-635348b0-7057-4ea0-af1a-7438cc82cc97 req-e833245f-3300-4620-b106-cc1c945823d2 service nova] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Received event network-vif-plugged-01c3579f-39fd-49bb-a4c9-248b20848ce0 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1804.385605] env[67964]: DEBUG oslo_concurrency.lockutils [req-635348b0-7057-4ea0-af1a-7438cc82cc97 req-e833245f-3300-4620-b106-cc1c945823d2 service nova] Acquiring lock "78e4a99a-35a1-4ad8-91f0-97f0e2a1641a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1804.385799] env[67964]: DEBUG oslo_concurrency.lockutils [req-635348b0-7057-4ea0-af1a-7438cc82cc97 req-e833245f-3300-4620-b106-cc1c945823d2 service nova] Lock "78e4a99a-35a1-4ad8-91f0-97f0e2a1641a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1804.385961] env[67964]: DEBUG oslo_concurrency.lockutils [req-635348b0-7057-4ea0-af1a-7438cc82cc97 req-e833245f-3300-4620-b106-cc1c945823d2 service nova] Lock "78e4a99a-35a1-4ad8-91f0-97f0e2a1641a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held
0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1804.386134] env[67964]: DEBUG nova.compute.manager [req-635348b0-7057-4ea0-af1a-7438cc82cc97 req-e833245f-3300-4620-b106-cc1c945823d2 service nova] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] No waiting events found dispatching network-vif-plugged-01c3579f-39fd-49bb-a4c9-248b20848ce0 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1804.386297] env[67964]: WARNING nova.compute.manager [req-635348b0-7057-4ea0-af1a-7438cc82cc97 req-e833245f-3300-4620-b106-cc1c945823d2 service nova] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Received unexpected event network-vif-plugged-01c3579f-39fd-49bb-a4c9-248b20848ce0 for instance with vm_state building and task_state spawning. [ 1804.386454] env[67964]: DEBUG nova.compute.manager [req-635348b0-7057-4ea0-af1a-7438cc82cc97 req-e833245f-3300-4620-b106-cc1c945823d2 service nova] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Received event network-changed-01c3579f-39fd-49bb-a4c9-248b20848ce0 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1804.386603] env[67964]: DEBUG nova.compute.manager [req-635348b0-7057-4ea0-af1a-7438cc82cc97 req-e833245f-3300-4620-b106-cc1c945823d2 service nova] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Refreshing instance network info cache due to event network-changed-01c3579f-39fd-49bb-a4c9-248b20848ce0. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1804.386776] env[67964]: DEBUG oslo_concurrency.lockutils [req-635348b0-7057-4ea0-af1a-7438cc82cc97 req-e833245f-3300-4620-b106-cc1c945823d2 service nova] Acquiring lock "refresh_cache-78e4a99a-35a1-4ad8-91f0-97f0e2a1641a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1804.386907] env[67964]: DEBUG oslo_concurrency.lockutils [req-635348b0-7057-4ea0-af1a-7438cc82cc97 req-e833245f-3300-4620-b106-cc1c945823d2 service nova] Acquired lock "refresh_cache-78e4a99a-35a1-4ad8-91f0-97f0e2a1641a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1804.387069] env[67964]: DEBUG nova.network.neutron [req-635348b0-7057-4ea0-af1a-7438cc82cc97 req-e833245f-3300-4620-b106-cc1c945823d2 service nova] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Refreshing network info cache for port 01c3579f-39fd-49bb-a4c9-248b20848ce0 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1804.395238] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Created folder: Instances in parent group-v690470. [ 1804.395458] env[67964]: DEBUG oslo.service.loopingcall [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1804.396161] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1804.396796] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-79b586bb-e6fe-49bf-9ad0-03940ee3f92c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.419300] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1804.419300] env[67964]: value = "task-3456870" [ 1804.419300] env[67964]: _type = "Task" [ 1804.419300] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1804.426877] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456870, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1804.654718] env[67964]: DEBUG nova.network.neutron [req-635348b0-7057-4ea0-af1a-7438cc82cc97 req-e833245f-3300-4620-b106-cc1c945823d2 service nova] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Updated VIF entry in instance network info cache for port 01c3579f-39fd-49bb-a4c9-248b20848ce0. {{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1804.655209] env[67964]: DEBUG nova.network.neutron [req-635348b0-7057-4ea0-af1a-7438cc82cc97 req-e833245f-3300-4620-b106-cc1c945823d2 service nova] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Updating instance_info_cache with network_info: [{"id": "01c3579f-39fd-49bb-a4c9-248b20848ce0", "address": "fa:16:3e:01:4a:2e", "network": {"id": "aeaa87c8-704e-479e-9bca-e70f676fcf32", "bridge": "br-int", "label": "tempest-ServersTestJSON-644115068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2b7f4b97c0ca4859964e6ea23310e9ce", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7d689fd7-f53e-4fd3-80d9-8d6b8fb7a164", "external-id": "nsx-vlan-transportzone-972", "segmentation_id": 972, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap01c3579f-39", "ovs_interfaceid": "01c3579f-39fd-49bb-a4c9-248b20848ce0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1804.665140] env[67964]: DEBUG oslo_concurrency.lockutils [req-635348b0-7057-4ea0-af1a-7438cc82cc97 req-e833245f-3300-4620-b106-cc1c945823d2 service nova] Releasing lock "refresh_cache-78e4a99a-35a1-4ad8-91f0-97f0e2a1641a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1804.928204] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456870, 'name': CreateVM_Task, 'duration_secs': 0.317581} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1804.928386] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1804.929029] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1804.929196] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1804.929503] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1804.929752] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c2d347c3-f539-40a2-9d8e-8f48a2918d1a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.934097] env[67964]: DEBUG oslo_vmware.api [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Waiting for the task: (returnval){ [ 1804.934097] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52ed3640-87cc-f7e2-35ff-2365e84f5890" [ 1804.934097] env[67964]: _type = "Task" [ 1804.934097] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1804.941277] env[67964]: DEBUG oslo_vmware.api [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52ed3640-87cc-f7e2-35ff-2365e84f5890, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1805.443911] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1805.444403] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1805.444403] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1806.800083] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1806.812788] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1806.813016] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1806.813234] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1806.813407] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1806.814530] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0d18472-1d5a-40b8-ad8d-d1221b89b4ed {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.823442] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef4d63f9-d797-40d4-ae09-5a148a36542c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.837182] env[67964]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e3c025a-cd4f-485a-bf48-09d5715a526f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.843270] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6317a44b-61d1-4f29-9f8f-e6ee5960de95 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.873064] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180910MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1806.873229] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1806.873400] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1806.945908] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 7825ba9e-8603-4211-b5fe-708276272464 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1806.946084] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec783231-6f62-4177-ba76-4ba688dda077 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1806.946241] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea5f3d40-6494-459a-a917-2602d0718d8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1806.946369] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance da8f11e2-6d58-4e28-aabb-9943bc657e60 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1806.946498] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2c06844d-2c7f-4e27-b3c6-16dfd6047119 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1806.946604] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 41d93bf8-7991-4b52-8ebb-a1988dc627c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1806.946715] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 430cad73-6b2c-4702-96a0-672f5b4c219f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1806.946827] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance bc98edf7-889e-4814-b859-d860033ba0cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1806.946938] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c01bc11b-384e-418e-be43-e12d0a845a24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1806.947059] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1806.957822] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 07489f39-f57c-4528-80b8-b42056181b8b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1806.968403] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 3e0e0504-9c76-4201-baf8-2d9636981f0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1806.968618] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1806.968770] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1806.984285] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing inventories for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:818}} [ 1806.998034] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Updating ProviderTree inventory for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:782}} [ 1806.998221] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Updating inventory in ProviderTree for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1807.008832] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing aggregate associations for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41, aggregates: None {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:827}} [ 1807.025980] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing trait associations for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:839}} [ 1807.167007] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-492a4a89-4785-4132-9324-e32e8d1c57c0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.174792] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e12689f6-c394-49c3-bee3-35d31854757c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.204977] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14cb32d3-da94-4df0-892e-bc76ff833fc5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.211753] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64ab32ee-9667-40f9-9885-ceef5cb9f2c9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.224576] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1807.232586] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1807.245463] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1807.245638] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.372s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1810.246031] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1810.246318] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1810.297178] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1810.297415] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1813.801268] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1813.801268] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1813.801810] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1813.801810] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1814.796555] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1816.802068] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1816.802068] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1816.802068] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1816.823542] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Skipping network cache update for instance because it is Building.
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1816.823691] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1816.823774] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1816.823897] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1816.824028] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1816.824148] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1816.824271] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1816.824381] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1816.824502] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1816.824645] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1816.824763] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1816.825222] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1822.508157] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquiring lock "c01bc11b-384e-418e-be43-e12d0a845a24" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1844.592898] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e5e5a7ff-dc2d-4fd0-90c6-eb6216f3bb71 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "78e4a99a-35a1-4ad8-91f0-97f0e2a1641a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1851.803058] env[67964]: WARNING oslo_vmware.rw_handles [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1851.803058] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1851.803058] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1851.803058] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1851.803058] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1851.803058] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1851.803058] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1851.803058] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1851.803058] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1851.803058] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1851.803058] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1851.803058] env[67964]: ERROR oslo_vmware.rw_handles [ 1851.803058] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/4f5c91a2-2fc6-4502-89e4-c17a21a4b97c/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1851.804996] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 
7825ba9e-8603-4211-b5fe-708276272464] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1851.805273] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Copying Virtual Disk [datastore1] vmware_temp/4f5c91a2-2fc6-4502-89e4-c17a21a4b97c/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/4f5c91a2-2fc6-4502-89e4-c17a21a4b97c/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1851.805565] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4e18ee98-e46b-4ea6-bbee-ed175c816d88 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.813940] env[67964]: DEBUG oslo_vmware.api [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Waiting for the task: (returnval){ [ 1851.813940] env[67964]: value = "task-3456871" [ 1851.813940] env[67964]: _type = "Task" [ 1851.813940] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1851.821616] env[67964]: DEBUG oslo_vmware.api [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Task: {'id': task-3456871, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1852.324480] env[67964]: DEBUG oslo_vmware.exceptions [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Fault InvalidArgument not matched. 
{{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1852.324831] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1852.325423] env[67964]: ERROR nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1852.325423] env[67964]: Faults: ['InvalidArgument'] [ 1852.325423] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] Traceback (most recent call last): [ 1852.325423] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1852.325423] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] yield resources [ 1852.325423] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1852.325423] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] self.driver.spawn(context, instance, image_meta, [ 1852.325423] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1852.325423] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1852.325423] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1852.325423] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] self._fetch_image_if_missing(context, vi) [ 1852.325423] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1852.325811] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] image_cache(vi, tmp_image_ds_loc) [ 1852.325811] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1852.325811] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] vm_util.copy_virtual_disk( [ 1852.325811] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1852.325811] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] session._wait_for_task(vmdk_copy_task) [ 1852.325811] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1852.325811] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] return self.wait_for_task(task_ref) [ 1852.325811] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1852.325811] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] return evt.wait() [ 1852.325811] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1852.325811] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] result = hub.switch() [ 1852.325811] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1852.325811] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] return self.greenlet.switch() [ 1852.326201] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1852.326201] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] self.f(*self.args, **self.kw) [ 1852.326201] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1852.326201] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] raise exceptions.translate_fault(task_info.error) [ 1852.326201] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1852.326201] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] Faults: ['InvalidArgument'] [ 1852.326201] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] [ 1852.326201] env[67964]: INFO nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Terminating instance [ 1852.327294] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1852.327498] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1852.327723] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-00e1ab95-87fa-45de-8a56-2a073cd53717 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.331299] env[67964]: DEBUG nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1852.331508] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1852.332224] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcda94ba-0953-47ca-a1d5-58333d24e39d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.338709] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1852.338882] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8be404fa-40ea-4c8d-a59b-8206a8eab60e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.342210] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1852.342406] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1852.343359] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9109027c-2ffb-432e-84c4-c268558ebe67 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.347971] env[67964]: DEBUG oslo_vmware.api [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Waiting for the task: (returnval){ [ 1852.347971] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52e663c6-1248-7200-9ea0-96c5570b6c1b" [ 1852.347971] env[67964]: _type = "Task" [ 1852.347971] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1852.354933] env[67964]: DEBUG oslo_vmware.api [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52e663c6-1248-7200-9ea0-96c5570b6c1b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1852.406784] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1852.406989] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1852.407184] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Deleting the datastore file [datastore1] 7825ba9e-8603-4211-b5fe-708276272464 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1852.407441] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-46afda47-bf49-40fd-bed4-b3c4be1ed544 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.412958] env[67964]: DEBUG oslo_vmware.api [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Waiting for the task: (returnval){ [ 1852.412958] env[67964]: value = "task-3456873" [ 1852.412958] env[67964]: _type = "Task" [ 1852.412958] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1852.420375] env[67964]: DEBUG oslo_vmware.api [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Task: {'id': task-3456873, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1852.858751] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1852.859150] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Creating directory with path [datastore1] vmware_temp/1da13941-61cb-4683-8bbf-83818c551dab/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1852.859258] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-65a56530-1473-4bed-92fe-e55ed80d5bd4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.870381] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Created directory with path [datastore1] vmware_temp/1da13941-61cb-4683-8bbf-83818c551dab/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1852.870574] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Fetch image to [datastore1] vmware_temp/1da13941-61cb-4683-8bbf-83818c551dab/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1852.870742] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/1da13941-61cb-4683-8bbf-83818c551dab/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1852.871444] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42014468-b47e-49b1-b5f8-db144f022245 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.877811] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86be2955-733f-4454-a09a-9df83fcc47ed {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.886379] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c958d47d-83a6-4b44-9b7a-f586ee6e42ad {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.918341] env[67964]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f35a812-0af1-4c63-8276-e423fadb4b47 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.925065] env[67964]: DEBUG oslo_vmware.api [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Task: {'id': task-3456873, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074193} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1852.926462] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1852.926652] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1852.926823] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1852.926995] env[67964]: INFO nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1852.928732] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d268a913-db27-4a89-92c0-a53c2e74fe4a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.930586] env[67964]: DEBUG nova.compute.claims [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1852.930752] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1852.930961] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1852.951888] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1853.118774] env[67964]: DEBUG oslo_vmware.rw_handles [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1da13941-61cb-4683-8bbf-83818c551dab/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1853.176883] env[67964]: DEBUG oslo_vmware.rw_handles [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1853.177095] env[67964]: DEBUG oslo_vmware.rw_handles [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1da13941-61cb-4683-8bbf-83818c551dab/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1853.185311] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8eff487c-f627-49af-a927-4110567c6d9a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.193798] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bf87223-d460-4136-a6f2-b7b78765740e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.222625] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4cbdad1-f8c5-4bd3-b676-723de97f2399 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.229116] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00cfe0ab-a9a4-4402-be16-26278a4b8d45 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.242083] env[67964]: DEBUG nova.compute.provider_tree [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1853.250338] env[67964]: DEBUG nova.scheduler.client.report [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1853.270506] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.339s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.271095] env[67964]: ERROR nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1853.271095] env[67964]: Faults: ['InvalidArgument'] [ 1853.271095] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] Traceback (most recent call last): [ 1853.271095] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1853.271095] 
env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] self.driver.spawn(context, instance, image_meta, [ 1853.271095] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1853.271095] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1853.271095] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1853.271095] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] self._fetch_image_if_missing(context, vi) [ 1853.271095] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1853.271095] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] image_cache(vi, tmp_image_ds_loc) [ 1853.271095] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1853.271463] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] vm_util.copy_virtual_disk( [ 1853.271463] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1853.271463] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] session._wait_for_task(vmdk_copy_task) [ 1853.271463] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1853.271463] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] return self.wait_for_task(task_ref) [ 1853.271463] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1853.271463] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] return evt.wait() [ 1853.271463] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1853.271463] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] result = hub.switch() [ 1853.271463] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1853.271463] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] return self.greenlet.switch() [ 1853.271463] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1853.271463] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] self.f(*self.args, **self.kw) [ 1853.271816] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1853.271816] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] raise exceptions.translate_fault(task_info.error) [ 1853.271816] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1853.271816] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] Faults: ['InvalidArgument'] [ 1853.271816] env[67964]: ERROR nova.compute.manager [instance: 7825ba9e-8603-4211-b5fe-708276272464] [ 1853.271816] env[67964]: DEBUG nova.compute.utils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1853.273280] env[67964]: DEBUG nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Build of instance 7825ba9e-8603-4211-b5fe-708276272464 was re-scheduled: A specified parameter was not correct: fileType [ 1853.273280] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1853.273668] env[67964]: DEBUG nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1853.273849] env[67964]: DEBUG nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1853.274032] env[67964]: DEBUG nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1853.274198] env[67964]: DEBUG nova.network.neutron [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1853.559061] env[67964]: DEBUG nova.network.neutron [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1853.572926] env[67964]: INFO nova.compute.manager [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Took 0.30 seconds to deallocate network for instance. [ 1853.670641] env[67964]: INFO nova.scheduler.client.report [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Deleted allocations for instance 7825ba9e-8603-4211-b5fe-708276272464 [ 1853.695438] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e522664-35e6-48a7-819e-71804d89dc8d tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "7825ba9e-8603-4211-b5fe-708276272464" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 655.300s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.697054] env[67964]: DEBUG oslo_concurrency.lockutils [None req-07a1b448-51df-494f-bb5c-9c707c4e0e04 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "7825ba9e-8603-4211-b5fe-708276272464" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 459.560s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1853.697302] env[67964]: DEBUG oslo_concurrency.lockutils [None req-07a1b448-51df-494f-bb5c-9c707c4e0e04 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "7825ba9e-8603-4211-b5fe-708276272464-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1853.697515] env[67964]: DEBUG oslo_concurrency.lockutils [None req-07a1b448-51df-494f-bb5c-9c707c4e0e04 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "7825ba9e-8603-4211-b5fe-708276272464-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1853.697688] env[67964]: DEBUG oslo_concurrency.lockutils [None req-07a1b448-51df-494f-bb5c-9c707c4e0e04 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "7825ba9e-8603-4211-b5fe-708276272464-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.699751] env[67964]: INFO nova.compute.manager [None req-07a1b448-51df-494f-bb5c-9c707c4e0e04 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Terminating instance [ 1853.702827] env[67964]: DEBUG nova.compute.manager [None req-07a1b448-51df-494f-bb5c-9c707c4e0e04 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1853.703036] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-07a1b448-51df-494f-bb5c-9c707c4e0e04 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1853.703518] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-32c9fbd6-76dc-4635-821f-3aec7c4fb249 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.707157] env[67964]: DEBUG nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1853.713675] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfe57916-8690-435e-a8c2-ca0262f620eb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.742783] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-07a1b448-51df-494f-bb5c-9c707c4e0e04 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7825ba9e-8603-4211-b5fe-708276272464 could not be found. 
[ 1853.742995] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-07a1b448-51df-494f-bb5c-9c707c4e0e04 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1853.743223] env[67964]: INFO nova.compute.manager [None req-07a1b448-51df-494f-bb5c-9c707c4e0e04 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1853.743488] env[67964]: DEBUG oslo.service.loopingcall [None req-07a1b448-51df-494f-bb5c-9c707c4e0e04 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1853.744539] env[67964]: DEBUG nova.compute.manager [-] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1853.745085] env[67964]: DEBUG nova.network.neutron [-] [instance: 7825ba9e-8603-4211-b5fe-708276272464] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1853.760055] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1853.760336] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1853.761777] env[67964]: INFO nova.compute.claims [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1853.774665] env[67964]: DEBUG nova.network.neutron [-] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1853.790516] env[67964]: INFO nova.compute.manager [-] [instance: 7825ba9e-8603-4211-b5fe-708276272464] Took 0.05 seconds to deallocate network for instance. 
[ 1853.871022] env[67964]: DEBUG oslo_concurrency.lockutils [None req-07a1b448-51df-494f-bb5c-9c707c4e0e04 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "7825ba9e-8603-4211-b5fe-708276272464" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.174s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.871963] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "7825ba9e-8603-4211-b5fe-708276272464" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 80.146s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1853.872107] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 7825ba9e-8603-4211-b5fe-708276272464] During sync_power_state the instance has a pending task (deleting). Skip. [ 1853.872251] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "7825ba9e-8603-4211-b5fe-708276272464" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.969781] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44680727-5288-494c-8f02-398d09d0eda8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.977207] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f95f6abb-adac-4af6-9aef-57f421d0c56d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1854.007917] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7d972d9-a003-4f4b-9c9d-2fcab8706d89 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1854.015031] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4aa19ca5-2a1c-49f9-90d6-200f0a14a7d1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1854.028102] env[67964]: DEBUG nova.compute.provider_tree [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1854.038377] env[67964]: DEBUG nova.scheduler.client.report [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 
'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1854.054493] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.294s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1854.054970] env[67964]: DEBUG nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1854.092039] env[67964]: DEBUG nova.compute.utils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1854.093346] env[67964]: DEBUG nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1854.093537] env[67964]: DEBUG nova.network.neutron [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1854.102369] env[67964]: DEBUG nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1854.157885] env[67964]: DEBUG nova.policy [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'daaca12089eb4485b5607a9d577f33b2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '83336cd0155c4286b66ac327ef1385b5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1854.165513] env[67964]: DEBUG nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1854.190605] env[67964]: DEBUG nova.virt.hardware [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1854.190861] env[67964]: DEBUG nova.virt.hardware [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1854.191027] env[67964]: DEBUG nova.virt.hardware [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1854.191495] env[67964]: DEBUG nova.virt.hardware [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1854.191495] env[67964]: DEBUG nova.virt.hardware [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1854.191495] env[67964]: DEBUG nova.virt.hardware [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1854.191688] env[67964]: DEBUG nova.virt.hardware [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1854.191923] env[67964]: DEBUG nova.virt.hardware [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1854.192133] env[67964]: DEBUG nova.virt.hardware [None 
req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1854.192304] env[67964]: DEBUG nova.virt.hardware [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1854.192502] env[67964]: DEBUG nova.virt.hardware [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1854.193356] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80e31c2f-52d9-4204-86b1-2f1721e5f3fb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1854.201112] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ffe4444-4e9f-4c1e-bbf2-8c65151a6bd8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1854.500404] env[67964]: DEBUG nova.network.neutron [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Successfully created port: 7437c5e2-7678-42aa-86c8-dbdee9397e57 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1855.138397] env[67964]: DEBUG nova.network.neutron [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Successfully updated port: 7437c5e2-7678-42aa-86c8-dbdee9397e57 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1855.156142] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "refresh_cache-07489f39-f57c-4528-80b8-b42056181b8b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1855.156298] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquired lock "refresh_cache-07489f39-f57c-4528-80b8-b42056181b8b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1855.156450] env[67964]: DEBUG nova.network.neutron [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1855.215331] env[67964]: DEBUG nova.network.neutron [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 
tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1855.384198] env[67964]: DEBUG nova.network.neutron [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Updating instance_info_cache with network_info: [{"id": "7437c5e2-7678-42aa-86c8-dbdee9397e57", "address": "fa:16:3e:09:6d:65", "network": {"id": "545a05d3-b8e2-435d-b1b5-1b6cb9a2d1ae", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1259553375-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83336cd0155c4286b66ac327ef1385b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7437c5e2-76", "ovs_interfaceid": "7437c5e2-7678-42aa-86c8-dbdee9397e57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1855.394850] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Releasing lock "refresh_cache-07489f39-f57c-4528-80b8-b42056181b8b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1855.395150] env[67964]: DEBUG nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Instance network_info: |[{"id": "7437c5e2-7678-42aa-86c8-dbdee9397e57", "address": "fa:16:3e:09:6d:65", "network": {"id": "545a05d3-b8e2-435d-b1b5-1b6cb9a2d1ae", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1259553375-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83336cd0155c4286b66ac327ef1385b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7437c5e2-76", "ovs_interfaceid": "7437c5e2-7678-42aa-86c8-dbdee9397e57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1855.395531] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:09:6d:65', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '50886eea-591a-452c-a27b-5f22cfc9df85', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7437c5e2-7678-42aa-86c8-dbdee9397e57', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1855.403317] env[67964]: DEBUG oslo.service.loopingcall [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1855.403798] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1855.404031] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bb9ca3cc-13f8-4d5b-adf1-cfbb8d7fdf47 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.425086] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1855.425086] env[67964]: value = "task-3456874" [ 1855.425086] env[67964]: _type = "Task" [ 1855.425086] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1855.433132] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456874, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1855.593546] env[67964]: DEBUG nova.compute.manager [req-529d2c71-195d-4780-9c1b-abb60feb3d21 req-88914073-9276-4c05-9f2f-340423546764 service nova] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Received event network-vif-plugged-7437c5e2-7678-42aa-86c8-dbdee9397e57 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1855.593791] env[67964]: DEBUG oslo_concurrency.lockutils [req-529d2c71-195d-4780-9c1b-abb60feb3d21 req-88914073-9276-4c05-9f2f-340423546764 service nova] Acquiring lock "07489f39-f57c-4528-80b8-b42056181b8b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1855.593990] env[67964]: DEBUG oslo_concurrency.lockutils [req-529d2c71-195d-4780-9c1b-abb60feb3d21 req-88914073-9276-4c05-9f2f-340423546764 service nova] Lock "07489f39-f57c-4528-80b8-b42056181b8b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1855.594177] env[67964]: DEBUG oslo_concurrency.lockutils [req-529d2c71-195d-4780-9c1b-abb60feb3d21 req-88914073-9276-4c05-9f2f-340423546764 service nova] Lock "07489f39-f57c-4528-80b8-b42056181b8b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1855.594339] env[67964]: DEBUG nova.compute.manager [req-529d2c71-195d-4780-9c1b-abb60feb3d21 req-88914073-9276-4c05-9f2f-340423546764 service nova] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] No waiting events found dispatching network-vif-plugged-7437c5e2-7678-42aa-86c8-dbdee9397e57 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1855.594500] env[67964]: WARNING nova.compute.manager [req-529d2c71-195d-4780-9c1b-abb60feb3d21 req-88914073-9276-4c05-9f2f-340423546764 service nova] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Received unexpected event network-vif-plugged-7437c5e2-7678-42aa-86c8-dbdee9397e57 for instance with vm_state building and task_state spawning. [ 1855.594658] env[67964]: DEBUG nova.compute.manager [req-529d2c71-195d-4780-9c1b-abb60feb3d21 req-88914073-9276-4c05-9f2f-340423546764 service nova] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Received event network-changed-7437c5e2-7678-42aa-86c8-dbdee9397e57 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1855.594895] env[67964]: DEBUG nova.compute.manager [req-529d2c71-195d-4780-9c1b-abb60feb3d21 req-88914073-9276-4c05-9f2f-340423546764 service nova] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Refreshing instance network info cache due to event network-changed-7437c5e2-7678-42aa-86c8-dbdee9397e57. 
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1855.595211] env[67964]: DEBUG oslo_concurrency.lockutils [req-529d2c71-195d-4780-9c1b-abb60feb3d21 req-88914073-9276-4c05-9f2f-340423546764 service nova] Acquiring lock "refresh_cache-07489f39-f57c-4528-80b8-b42056181b8b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1855.595471] env[67964]: DEBUG oslo_concurrency.lockutils [req-529d2c71-195d-4780-9c1b-abb60feb3d21 req-88914073-9276-4c05-9f2f-340423546764 service nova] Acquired lock "refresh_cache-07489f39-f57c-4528-80b8-b42056181b8b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1855.595741] env[67964]: DEBUG nova.network.neutron [req-529d2c71-195d-4780-9c1b-abb60feb3d21 req-88914073-9276-4c05-9f2f-340423546764 service nova] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Refreshing network info cache for port 7437c5e2-7678-42aa-86c8-dbdee9397e57 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1855.857153] env[67964]: DEBUG nova.network.neutron [req-529d2c71-195d-4780-9c1b-abb60feb3d21 req-88914073-9276-4c05-9f2f-340423546764 service nova] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Updated VIF entry in instance network info cache for port 7437c5e2-7678-42aa-86c8-dbdee9397e57. {{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1855.857568] env[67964]: DEBUG nova.network.neutron [req-529d2c71-195d-4780-9c1b-abb60feb3d21 req-88914073-9276-4c05-9f2f-340423546764 service nova] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Updating instance_info_cache with network_info: [{"id": "7437c5e2-7678-42aa-86c8-dbdee9397e57", "address": "fa:16:3e:09:6d:65", "network": {"id": "545a05d3-b8e2-435d-b1b5-1b6cb9a2d1ae", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1259553375-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83336cd0155c4286b66ac327ef1385b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7437c5e2-76", "ovs_interfaceid": "7437c5e2-7678-42aa-86c8-dbdee9397e57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1855.866703] env[67964]: DEBUG oslo_concurrency.lockutils [req-529d2c71-195d-4780-9c1b-abb60feb3d21 req-88914073-9276-4c05-9f2f-340423546764 service nova] Releasing lock "refresh_cache-07489f39-f57c-4528-80b8-b42056181b8b" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1855.935190] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456874, 'name': CreateVM_Task, 'duration_secs': 0.275596} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1855.935366] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1855.941498] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1855.941664] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1855.941966] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1855.942241] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e09a56a1-a780-4b42-8a5b-77755a49879d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.946269] env[67964]: DEBUG oslo_vmware.api [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Waiting for the task: (returnval){ [ 1855.946269] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52476139-f5dc-aeac-2ce2-e701e11c32ad" [ 1855.946269] env[67964]: _type = "Task" [ 1855.946269] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1855.953373] env[67964]: DEBUG oslo_vmware.api [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52476139-f5dc-aeac-2ce2-e701e11c32ad, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1856.457021] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1856.457318] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1856.457580] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1866.800796] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1866.812664] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1866.812892] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1866.813071] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1866.813229] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1866.814318] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48e7d04b-ce9c-4604-8145-4f042fb657f4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1866.822720] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0583e657-e64c-46f9-b5b9-c0744be2ddcb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1866.836481] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b87fce0a-d66e-4cf0-9514-fba0b0525b11 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1866.842426] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70cb5af0-34db-4303-b2fa-1eb2cf4eaeee {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1866.872276] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180926MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1866.872437] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1866.872617] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1866.943429] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ec783231-6f62-4177-ba76-4ba688dda077 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1866.943606] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea5f3d40-6494-459a-a917-2602d0718d8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1866.943703] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance da8f11e2-6d58-4e28-aabb-9943bc657e60 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1866.943817] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2c06844d-2c7f-4e27-b3c6-16dfd6047119 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1866.943937] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 41d93bf8-7991-4b52-8ebb-a1988dc627c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1866.944064] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 430cad73-6b2c-4702-96a0-672f5b4c219f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1866.944182] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance bc98edf7-889e-4814-b859-d860033ba0cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1866.944293] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c01bc11b-384e-418e-be43-e12d0a845a24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1866.944404] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1866.944515] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 07489f39-f57c-4528-80b8-b42056181b8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1866.954678] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 3e0e0504-9c76-4201-baf8-2d9636981f0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1866.964090] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance aa9c54a7-7b81-45cb-9f53-2016f4ea4b72 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1866.964298] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1866.964438] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1867.105988] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e51d773-c822-485f-ac69-03cb99d468c6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1867.111818] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba0e8513-4000-4098-b200-37944b21409f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1867.140327] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d029eac-416a-4c0f-bc6a-403bb424271f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1867.146936] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f64d5718-5234-4cf9-b50a-6eae31f0c2ef {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1867.159443] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1867.167858] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1867.180195] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1867.180368] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.308s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1872.180639] env[67964]: DEBUG oslo_service.periodic_task [None 
req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1872.180921] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1874.801459] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1875.795699] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1875.800478] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1875.800767] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1875.801060] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1876.801071] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1876.801071] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1876.801071] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1876.821375] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1876.821597] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1876.821719] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1876.821845] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1876.821964] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1876.822098] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1876.822217] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1876.822332] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1876.822446] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1876.822560] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1876.822687] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1876.823171] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1887.817743] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1900.623043] env[67964]: WARNING oslo_vmware.rw_handles [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1900.623043] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1900.623043] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1900.623043] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1900.623043] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1900.623043] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1900.623043] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1900.623043] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1900.623043] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1900.623043] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1900.623043] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1900.623043] env[67964]: ERROR oslo_vmware.rw_handles [ 1900.623910] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/1da13941-61cb-4683-8bbf-83818c551dab/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1900.625651] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1900.625935] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Copying Virtual Disk [datastore1] vmware_temp/1da13941-61cb-4683-8bbf-83818c551dab/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] 
vmware_temp/1da13941-61cb-4683-8bbf-83818c551dab/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1900.626241] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-28d8cc9a-9b7a-4a24-8e49-d282c2cca7dc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1900.635040] env[67964]: DEBUG oslo_vmware.api [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Waiting for the task: (returnval){ [ 1900.635040] env[67964]: value = "task-3456875" [ 1900.635040] env[67964]: _type = "Task" [ 1900.635040] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1900.642928] env[67964]: DEBUG oslo_vmware.api [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Task: {'id': task-3456875, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1901.145900] env[67964]: DEBUG oslo_vmware.exceptions [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1901.146031] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1901.146556] env[67964]: ERROR nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1901.146556] env[67964]: Faults: ['InvalidArgument'] [ 1901.146556] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] Traceback (most recent call last): [ 1901.146556] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1901.146556] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] yield resources [ 1901.146556] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1901.146556] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] self.driver.spawn(context, instance, image_meta, [ 1901.146556] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1901.146556] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1901.146556] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1901.146556] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] self._fetch_image_if_missing(context, vi) [ 1901.146556] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1901.147111] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] image_cache(vi, tmp_image_ds_loc) [ 1901.147111] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1901.147111] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] vm_util.copy_virtual_disk( [ 1901.147111] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1901.147111] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] session._wait_for_task(vmdk_copy_task) [ 1901.147111] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1901.147111] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] return self.wait_for_task(task_ref) [ 1901.147111] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1901.147111] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] return evt.wait() [ 1901.147111] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1901.147111] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] result = hub.switch() [ 1901.147111] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1901.147111] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] return self.greenlet.switch() [ 1901.147551] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1901.147551] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] self.f(*self.args, **self.kw) [ 1901.147551] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1901.147551] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] raise exceptions.translate_fault(task_info.error) [ 
1901.147551] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1901.147551] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] Faults: ['InvalidArgument'] [ 1901.147551] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] [ 1901.147551] env[67964]: INFO nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Terminating instance [ 1901.148416] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1901.148623] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1901.148857] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7fda4800-7e42-4728-8d67-a045105c7b96 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.151019] env[67964]: DEBUG nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1901.151220] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1901.151926] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f489ab0-fcc6-40b1-be9f-ea5c58214799 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.158882] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1901.159922] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-43d3ffa9-fb2b-4b1e-878c-a7c9a266bd2a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.161286] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1901.161450] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1901.162107] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-58707469-a4e7-42b2-ab89-47ab8e2fdc34 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.166825] env[67964]: DEBUG oslo_vmware.api [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Waiting for the task: (returnval){ [ 1901.166825] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5223e7b4-3bad-acb1-dd16-83987af0fd5f" [ 1901.166825] env[67964]: _type = "Task" [ 1901.166825] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1901.174091] env[67964]: DEBUG oslo_vmware.api [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5223e7b4-3bad-acb1-dd16-83987af0fd5f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1901.235265] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1901.235429] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1901.235593] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Deleting the datastore file [datastore1] ec783231-6f62-4177-ba76-4ba688dda077 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1901.235847] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-99b32123-d988-4622-87af-a23901ceed20 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.243951] env[67964]: DEBUG oslo_vmware.api [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Waiting for the task: (returnval){ [ 1901.243951] env[67964]: value = "task-3456877" [ 1901.243951] env[67964]: _type = "Task" [ 1901.243951] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1901.251269] env[67964]: DEBUG oslo_vmware.api [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Task: {'id': task-3456877, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1901.677409] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1901.677682] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Creating directory with path [datastore1] vmware_temp/86c7e199-70d5-4b2c-a93c-b8edb12268ca/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1901.677924] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c00eed73-fc29-4318-9f44-2f8bdb1920c4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.689446] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Created directory with path [datastore1] vmware_temp/86c7e199-70d5-4b2c-a93c-b8edb12268ca/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1901.689626] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Fetch image to [datastore1] vmware_temp/86c7e199-70d5-4b2c-a93c-b8edb12268ca/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1901.689794] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/86c7e199-70d5-4b2c-a93c-b8edb12268ca/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1901.690573] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19e8f537-9474-4842-95ae-4d4ac6c7aa85 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.697196] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0856494-aa52-4e0c-8ece-fb11ad9cf495 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.706327] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5efe5f35-0938-4820-a177-4eeec1ff9121 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.738028] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fb43bf8-086b-4693-bb92-4d1037691b40 {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.742828] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f879ad30-53da-47bf-9146-d8857762bd53 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.751465] env[67964]: DEBUG oslo_vmware.api [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Task: {'id': task-3456877, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077129} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1901.751701] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1901.751915] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1901.752054] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1901.752235] env[67964]: INFO nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Took 0.60 seconds to destroy the instance on the hypervisor. 
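The "Waiting for the task: (returnval){ value = "task-..." }" entries and the "progress is 0%" / "completed successfully" lines around this point are produced by oslo.vmware's task-polling loop (the wait_for_task / _poll_task frames visible in the tracebacks above). A minimal sketch of that polling pattern, in Python, assuming a hypothetical get_task_info(task) accessor in place of the real vSphere PropertyCollector lookup:

    import time

    def wait_for_task(task, get_task_info, poll_interval=0.5):
        # Poll a vCenter-style task until it leaves the 'queued'/'running'
        # states. get_task_info is a stand-in for this sketch only: the real
        # driver fetches the task's 'info' property over the vSphere SOAP API.
        while True:
            info = get_task_info(task)
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # This is the branch that surfaced earlier in the log as
                # VimFaultException: "A specified parameter was not correct:
                # fileType" (Faults: ['InvalidArgument']).
                raise RuntimeError(info.get('error', 'task failed'))
            time.sleep(poll_interval)

In the entries just above, DeleteDatastoreFile_Task reached the 'success' branch after 'duration_secs': 0.077129, while the earlier CopyVirtualDisk_Task for the same image cache is the one that took the 'error' branch.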
[ 1901.754286] env[67964]: DEBUG nova.compute.claims [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1901.754458] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1901.754666] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1901.767354] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1901.821300] env[67964]: DEBUG oslo_vmware.rw_handles [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/86c7e199-70d5-4b2c-a93c-b8edb12268ca/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1901.880620] env[67964]: DEBUG oslo_vmware.rw_handles [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1901.880809] env[67964]: DEBUG oslo_vmware.rw_handles [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/86c7e199-70d5-4b2c-a93c-b8edb12268ca/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1901.988138] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9d5600d-b083-403d-b2e6-710765b1896d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.996248] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f48bbde8-f768-4015-ac41-1316d75f918b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.024995] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27c82f38-f12f-4f0e-992c-ee6e46e5e14f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.031947] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b1cdae9-de7a-47de-b2a9-6dce645ee0e0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.045693] env[67964]: DEBUG nova.compute.provider_tree [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1902.054329] env[67964]: DEBUG nova.scheduler.client.report [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1902.072205] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.317s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1902.072816] env[67964]: ERROR nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1902.072816] env[67964]: Faults: ['InvalidArgument'] [ 1902.072816] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] Traceback (most recent call last): [ 1902.072816] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/nova/nova/compute/manager.py", line 
2616, in _build_and_run_instance [ 1902.072816] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] self.driver.spawn(context, instance, image_meta, [ 1902.072816] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1902.072816] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1902.072816] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1902.072816] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] self._fetch_image_if_missing(context, vi) [ 1902.072816] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1902.072816] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] image_cache(vi, tmp_image_ds_loc) [ 1902.072816] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1902.073283] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] vm_util.copy_virtual_disk( [ 1902.073283] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1902.073283] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] session._wait_for_task(vmdk_copy_task) [ 1902.073283] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1902.073283] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] return self.wait_for_task(task_ref) [ 1902.073283] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1902.073283] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] return evt.wait() [ 1902.073283] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1902.073283] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] result = hub.switch() [ 1902.073283] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1902.073283] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] return self.greenlet.switch() [ 1902.073283] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1902.073283] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] self.f(*self.args, **self.kw) [ 1902.073786] env[67964]: ERROR nova.compute.manager [instance: 
ec783231-6f62-4177-ba76-4ba688dda077] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1902.073786] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] raise exceptions.translate_fault(task_info.error) [ 1902.073786] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1902.073786] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] Faults: ['InvalidArgument'] [ 1902.073786] env[67964]: ERROR nova.compute.manager [instance: ec783231-6f62-4177-ba76-4ba688dda077] [ 1902.073786] env[67964]: DEBUG nova.compute.utils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1902.075177] env[67964]: DEBUG nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Build of instance ec783231-6f62-4177-ba76-4ba688dda077 was re-scheduled: A specified parameter was not correct: fileType [ 1902.075177] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1902.075519] env[67964]: DEBUG nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1902.075693] env[67964]: DEBUG nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1902.075862] env[67964]: DEBUG nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1902.076036] env[67964]: DEBUG nova.network.neutron [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1902.461534] env[67964]: DEBUG nova.network.neutron [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1902.471352] env[67964]: INFO nova.compute.manager [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Took 0.40 seconds to deallocate network for instance. [ 1902.567265] env[67964]: INFO nova.scheduler.client.report [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Deleted allocations for instance ec783231-6f62-4177-ba76-4ba688dda077 [ 1902.590097] env[67964]: DEBUG oslo_concurrency.lockutils [None req-6ac28269-1300-461d-9dc5-529104cae625 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Lock "ec783231-6f62-4177-ba76-4ba688dda077" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 669.296s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1902.591017] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e7f5111-7df7-4d0d-bc77-d8548545b7b2 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Lock "ec783231-6f62-4177-ba76-4ba688dda077" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 473.140s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1902.591260] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e7f5111-7df7-4d0d-bc77-d8548545b7b2 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Acquiring lock "ec783231-6f62-4177-ba76-4ba688dda077-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1902.591474] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e7f5111-7df7-4d0d-bc77-d8548545b7b2 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] 
Lock "ec783231-6f62-4177-ba76-4ba688dda077-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1902.591640] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e7f5111-7df7-4d0d-bc77-d8548545b7b2 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Lock "ec783231-6f62-4177-ba76-4ba688dda077-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1902.594696] env[67964]: INFO nova.compute.manager [None req-1e7f5111-7df7-4d0d-bc77-d8548545b7b2 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Terminating instance [ 1902.596431] env[67964]: DEBUG nova.compute.manager [None req-1e7f5111-7df7-4d0d-bc77-d8548545b7b2 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1902.596624] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e7f5111-7df7-4d0d-bc77-d8548545b7b2 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1902.596873] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6c7ac52d-ffc6-49f2-96e2-df556c094f2f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.606641] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16cb0013-b561-43bc-8b34-5ab817cf951d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.618517] env[67964]: DEBUG nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1902.638985] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-1e7f5111-7df7-4d0d-bc77-d8548545b7b2 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ec783231-6f62-4177-ba76-4ba688dda077 could not be found. 
[ 1902.639199] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1e7f5111-7df7-4d0d-bc77-d8548545b7b2 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1902.639375] env[67964]: INFO nova.compute.manager [None req-1e7f5111-7df7-4d0d-bc77-d8548545b7b2 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1902.639609] env[67964]: DEBUG oslo.service.loopingcall [None req-1e7f5111-7df7-4d0d-bc77-d8548545b7b2 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1902.639838] env[67964]: DEBUG nova.compute.manager [-] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1902.639932] env[67964]: DEBUG nova.network.neutron [-] [instance: ec783231-6f62-4177-ba76-4ba688dda077] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1902.664051] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1902.664301] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1902.665738] env[67964]: INFO nova.compute.claims [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1902.671903] env[67964]: DEBUG nova.network.neutron [-] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1902.685223] env[67964]: INFO nova.compute.manager [-] [instance: ec783231-6f62-4177-ba76-4ba688dda077] Took 0.05 seconds to deallocate network for instance. 
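The "Acquiring lock ... by ..." / "Lock ... acquired ... waited Ns" / "Lock ... released ... held Ns" triplets throughout this section are emitted by oslo.concurrency's lockutils wrappers, which time both the wait for acquisition and the hold (e.g. the held 0.195s and waited 129.062s figures immediately below). The same in-process named locks can be taken as follows; the lock names match the log, but the function bodies are illustrative only:

    from oslo_concurrency import lockutils

    # Decorator form, as used by the resource-tracker methods in this log
    # (instance_claim, _update_available_resource, abort_instance_claim):
    @lockutils.synchronized('compute_resources')
    def instance_claim():
        # Only one caller per process runs at a time; lockutils logs the
        # acquire/release lines seen above.
        pass

    # Context-manager form, e.g. for the per-instance event lock:
    with lockutils.lock('ec783231-6f62-4177-ba76-4ba688dda077-events'):
        pass  # clear_events_for_instance-style critical section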
[ 1902.786200] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1e7f5111-7df7-4d0d-bc77-d8548545b7b2 tempest-InstanceActionsNegativeTestJSON-249389966 tempest-InstanceActionsNegativeTestJSON-249389966-project-member] Lock "ec783231-6f62-4177-ba76-4ba688dda077" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.195s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1902.787252] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "ec783231-6f62-4177-ba76-4ba688dda077" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 129.062s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1902.787440] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ec783231-6f62-4177-ba76-4ba688dda077] During sync_power_state the instance has a pending task (deleting). Skip. [ 1902.787603] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "ec783231-6f62-4177-ba76-4ba688dda077" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1902.858875] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a0ce972-3a44-4045-bb0b-d6d01fa6f7bd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.866808] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b432549-1808-463f-b67c-0210db9609c6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.895798] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52cf72ce-ffe4-48e0-91f8-eb5b6d4da804 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.902722] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63b1f137-6e2c-475d-a720-60e909f4c1fd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.915561] env[67964]: DEBUG nova.compute.provider_tree [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1902.924866] env[67964]: DEBUG nova.scheduler.client.report [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total':
400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1902.937755] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1902.938234] env[67964]: DEBUG nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1902.969519] env[67964]: DEBUG nova.compute.utils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1902.970662] env[67964]: DEBUG nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1902.970875] env[67964]: DEBUG nova.network.neutron [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1902.980219] env[67964]: DEBUG nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1903.032115] env[67964]: DEBUG nova.policy [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4582636e1ee74b61878e4c1badbd563e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '15502e37757142d4afa0577a3e80bfb8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1903.042244] env[67964]: DEBUG nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Start spawning the instance on the hypervisor. 
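The provider_tree and report-client entries above compare the locally computed inventory against what placement already holds and skip the write when nothing changed. A sketch of that compare-before-write step, using the VCPU/MEMORY_MB/DISK_GB payload from the log; push is a stand-in for the actual placement update call:

    def set_inventory_if_changed(current, desired, push):
        """Only call push() (the placement update) when the payload differs."""
        if current == desired:
            print("Inventory has not changed; skipping update")
            return False
        push(desired)
        return True

    desired = {
        "VCPU": {"total": 48, "reserved": 0, "min_unit": 1, "max_unit": 16,
                 "step_size": 1, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "min_unit": 1,
                      "max_unit": 65530, "step_size": 1, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "min_unit": 1, "max_unit": 95,
                    "step_size": 1, "allocation_ratio": 1.0},
    }
    set_inventory_if_changed(desired, desired, push=lambda inv: None)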
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1903.066260] env[67964]: DEBUG nova.virt.hardware [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1903.066497] env[67964]: DEBUG nova.virt.hardware [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1903.066665] env[67964]: DEBUG nova.virt.hardware [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1903.066851] env[67964]: DEBUG nova.virt.hardware [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1903.066998] env[67964]: DEBUG nova.virt.hardware [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1903.067203] env[67964]: DEBUG nova.virt.hardware [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1903.067415] env[67964]: DEBUG nova.virt.hardware [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1903.067586] env[67964]: DEBUG nova.virt.hardware [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1903.067752] env[67964]: DEBUG 
nova.virt.hardware [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1903.067924] env[67964]: DEBUG nova.virt.hardware [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1903.068120] env[67964]: DEBUG nova.virt.hardware [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1903.068997] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7488c2f3-a806-48e9-9126-9ac61642aba7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.077486] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ebe8cfe-e14f-4903-9767-c7f3a4836d44 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.366024] env[67964]: DEBUG nova.network.neutron [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Successfully created port: ccb33a34-5c90-4698-b325-686dc3a4bd95 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1904.211494] env[67964]: DEBUG nova.network.neutron [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Successfully updated port: ccb33a34-5c90-4698-b325-686dc3a4bd95 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1904.224512] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "refresh_cache-3e0e0504-9c76-4201-baf8-2d9636981f0c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1904.224512] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquired lock "refresh_cache-3e0e0504-9c76-4201-baf8-2d9636981f0c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1904.224512] env[67964]: DEBUG nova.network.neutron [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1904.271390] env[67964]: DEBUG nova.network.neutron [None 
req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1904.431419] env[67964]: DEBUG nova.network.neutron [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Updating instance_info_cache with network_info: [{"id": "ccb33a34-5c90-4698-b325-686dc3a4bd95", "address": "fa:16:3e:ac:4b:df", "network": {"id": "35550b63-2fb8-405c-84f4-2ef94086947d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1240380541-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "15502e37757142d4afa0577a3e80bfb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4d548e7-d762-406a-bb2d-dc7168a8ca67", "external-id": "nsx-vlan-transportzone-796", "segmentation_id": 796, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapccb33a34-5c", "ovs_interfaceid": "ccb33a34-5c90-4698-b325-686dc3a4bd95", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1904.448181] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Releasing lock "refresh_cache-3e0e0504-9c76-4201-baf8-2d9636981f0c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1904.448550] env[67964]: DEBUG nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Instance network_info: |[{"id": "ccb33a34-5c90-4698-b325-686dc3a4bd95", "address": "fa:16:3e:ac:4b:df", "network": {"id": "35550b63-2fb8-405c-84f4-2ef94086947d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1240380541-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "15502e37757142d4afa0577a3e80bfb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4d548e7-d762-406a-bb2d-dc7168a8ca67", "external-id": "nsx-vlan-transportzone-796", "segmentation_id": 796, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapccb33a34-5c", "ovs_interfaceid": 
"ccb33a34-5c90-4698-b325-686dc3a4bd95", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1904.449201] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ac:4b:df', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b4d548e7-d762-406a-bb2d-dc7168a8ca67', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ccb33a34-5c90-4698-b325-686dc3a4bd95', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1904.457555] env[67964]: DEBUG oslo.service.loopingcall [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1904.458027] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1904.458269] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-88bac451-b5b5-463f-b4d0-be9392f38cf5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.481242] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1904.481242] env[67964]: value = "task-3456878" [ 1904.481242] env[67964]: _type = "Task" [ 1904.481242] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1904.488146] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456878, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1904.497660] env[67964]: DEBUG nova.compute.manager [req-d7370325-cb59-48f8-a11d-f8774dda59c2 req-4efc35a6-f837-476a-abe0-e01677db3ae4 service nova] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Received event network-vif-plugged-ccb33a34-5c90-4698-b325-686dc3a4bd95 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1904.497947] env[67964]: DEBUG oslo_concurrency.lockutils [req-d7370325-cb59-48f8-a11d-f8774dda59c2 req-4efc35a6-f837-476a-abe0-e01677db3ae4 service nova] Acquiring lock "3e0e0504-9c76-4201-baf8-2d9636981f0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.498099] env[67964]: DEBUG oslo_concurrency.lockutils [req-d7370325-cb59-48f8-a11d-f8774dda59c2 req-4efc35a6-f837-476a-abe0-e01677db3ae4 service nova] Lock "3e0e0504-9c76-4201-baf8-2d9636981f0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1904.498236] env[67964]: DEBUG oslo_concurrency.lockutils [req-d7370325-cb59-48f8-a11d-f8774dda59c2 req-4efc35a6-f837-476a-abe0-e01677db3ae4 service nova] Lock "3e0e0504-9c76-4201-baf8-2d9636981f0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1904.498407] env[67964]: DEBUG nova.compute.manager [req-d7370325-cb59-48f8-a11d-f8774dda59c2 req-4efc35a6-f837-476a-abe0-e01677db3ae4 service nova] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] No waiting events found dispatching network-vif-plugged-ccb33a34-5c90-4698-b325-686dc3a4bd95 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1904.498555] env[67964]: WARNING nova.compute.manager [req-d7370325-cb59-48f8-a11d-f8774dda59c2 req-4efc35a6-f837-476a-abe0-e01677db3ae4 service nova] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Received unexpected event network-vif-plugged-ccb33a34-5c90-4698-b325-686dc3a4bd95 for instance with vm_state building and task_state spawning. [ 1904.498689] env[67964]: DEBUG nova.compute.manager [req-d7370325-cb59-48f8-a11d-f8774dda59c2 req-4efc35a6-f837-476a-abe0-e01677db3ae4 service nova] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Received event network-changed-ccb33a34-5c90-4698-b325-686dc3a4bd95 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1904.498833] env[67964]: DEBUG nova.compute.manager [req-d7370325-cb59-48f8-a11d-f8774dda59c2 req-4efc35a6-f837-476a-abe0-e01677db3ae4 service nova] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Refreshing instance network info cache due to event network-changed-ccb33a34-5c90-4698-b325-686dc3a4bd95.
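The pop_instance_event sequence above looks up a registered waiter for network-vif-plugged under the "<uuid>-events" lock and, finding none, falls through to the "Received unexpected event" warning. A dictionary-based sketch of that pop-or-warn dispatch, with illustrative names:

    import threading

    _registry_lock = threading.Lock()
    _waiters = {}  # (instance_uuid, event_name) -> callback registered by a waiter

    def pop_instance_event(instance_uuid, event_name):
        with _registry_lock:
            return _waiters.pop((instance_uuid, event_name), None)

    def external_instance_event(instance_uuid, event_name):
        waiter = pop_instance_event(instance_uuid, event_name)
        if waiter is None:
            # Nobody was waiting: log and drop, as the WARNING above does.
            print(f"WARNING: received unexpected event {event_name} "
                  f"for instance {instance_uuid}")
        else:
            waiter()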
{{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1904.499036] env[67964]: DEBUG oslo_concurrency.lockutils [req-d7370325-cb59-48f8-a11d-f8774dda59c2 req-4efc35a6-f837-476a-abe0-e01677db3ae4 service nova] Acquiring lock "refresh_cache-3e0e0504-9c76-4201-baf8-2d9636981f0c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1904.499564] env[67964]: DEBUG oslo_concurrency.lockutils [req-d7370325-cb59-48f8-a11d-f8774dda59c2 req-4efc35a6-f837-476a-abe0-e01677db3ae4 service nova] Acquired lock "refresh_cache-3e0e0504-9c76-4201-baf8-2d9636981f0c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1904.499732] env[67964]: DEBUG nova.network.neutron [req-d7370325-cb59-48f8-a11d-f8774dda59c2 req-4efc35a6-f837-476a-abe0-e01677db3ae4 service nova] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Refreshing network info cache for port ccb33a34-5c90-4698-b325-686dc3a4bd95 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1904.791994] env[67964]: DEBUG nova.network.neutron [req-d7370325-cb59-48f8-a11d-f8774dda59c2 req-4efc35a6-f837-476a-abe0-e01677db3ae4 service nova] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Updated VIF entry in instance network info cache for port ccb33a34-5c90-4698-b325-686dc3a4bd95. {{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1904.792375] env[67964]: DEBUG nova.network.neutron [req-d7370325-cb59-48f8-a11d-f8774dda59c2 req-4efc35a6-f837-476a-abe0-e01677db3ae4 service nova] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Updating instance_info_cache with network_info: [{"id": "ccb33a34-5c90-4698-b325-686dc3a4bd95", "address": "fa:16:3e:ac:4b:df", "network": {"id": "35550b63-2fb8-405c-84f4-2ef94086947d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1240380541-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "15502e37757142d4afa0577a3e80bfb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4d548e7-d762-406a-bb2d-dc7168a8ca67", "external-id": "nsx-vlan-transportzone-796", "segmentation_id": 796, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapccb33a34-5c", "ovs_interfaceid": "ccb33a34-5c90-4698-b325-686dc3a4bd95", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1904.801450] env[67964]: DEBUG oslo_concurrency.lockutils [req-d7370325-cb59-48f8-a11d-f8774dda59c2 req-4efc35a6-f837-476a-abe0-e01677db3ae4 service nova] Releasing lock "refresh_cache-3e0e0504-9c76-4201-baf8-2d9636981f0c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1904.989474] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456878, 'name': CreateVM_Task, 'duration_secs': 0.301417} completed successfully. 
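CreateVM_Task above goes from "progress is 0%" to "completed successfully" via the session's task poller. A stdlib sketch of that polling loop, with get_task_info standing in for the real property read against vCenter:

    import time

    def wait_for_task(get_task_info, poll_interval=0.5):
        """Poll a vCenter-style task dict until it succeeds or errors."""
        while True:
            info = get_task_info()
            state = info["state"]
            if state == "success":
                return info
            if state == "error":
                raise RuntimeError(info.get("error", "task failed"))
            print(f"Task progress is {info.get('progress', 0)}%.")
            time.sleep(poll_interval)

    # Simulated task that reports running once, then succeeds.
    states = iter([{"state": "running", "progress": 0}, {"state": "success"}])
    wait_for_task(lambda: next(states), poll_interval=0.0)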
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1904.989645] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1904.990324] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1904.990493] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1904.990801] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1904.991063] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2ab8fb57-9f53-4364-a87c-fddaeacf4a61 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.995562] env[67964]: DEBUG oslo_vmware.api [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for the task: (returnval){ [ 1904.995562] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52c09324-61bf-9c4e-aac8-c513ef848929" [ 1904.995562] env[67964]: _type = "Task" [ 1904.995562] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1905.002916] env[67964]: DEBUG oslo_vmware.api [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52c09324-61bf-9c4e-aac8-c513ef848929, 'name': SearchDatastore_Task} progress is 0%. 
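The datastore image-cache entries above take a lock named after the cached image path, then run SearchDatastore_Task to decide whether the image still needs to be fetched. A sketch of that lock-then-check-then-fetch shape; fetch is a stand-in for the actual download/copy into the datastore cache:

    import threading
    from collections import defaultdict

    _cache_locks = defaultdict(threading.Lock)
    _image_cache = set()  # datastore paths known to hold a cached image

    def fetch_image_if_missing(cache_path, fetch):
        """Serialize per cached-image path, fetching only on a cache miss."""
        with _cache_locks[cache_path]:
            if cache_path in _image_cache:
                return  # another request already populated the cache
            fetch()
            _image_cache.add(cache_path)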
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1905.505869] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1905.506262] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1905.506369] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1926.803194] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1926.815014] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1926.815014] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1926.815158] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1926.815315] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1926.816451] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-922aef91-68ce-4d79-8b78-28387af3a731 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.825518] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-879d3bc7-4e79-4589-b22f-2b9c4e934966 {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.841062] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec5d34fa-8be8-4062-92f6-6eab35eef387 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.846697] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfab0e74-324f-4b01-9837-d5b3a002e11f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.874602] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180909MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1926.874748] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1926.874924] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1926.944315] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance ea5f3d40-6494-459a-a917-2602d0718d8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1926.944481] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance da8f11e2-6d58-4e28-aabb-9943bc657e60 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1926.944608] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2c06844d-2c7f-4e27-b3c6-16dfd6047119 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1926.944729] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 41d93bf8-7991-4b52-8ebb-a1988dc627c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1926.944848] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 430cad73-6b2c-4702-96a0-672f5b4c219f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1926.944967] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance bc98edf7-889e-4814-b859-d860033ba0cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1926.945097] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c01bc11b-384e-418e-be43-e12d0a845a24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1926.945218] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1926.945330] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 07489f39-f57c-4528-80b8-b42056181b8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1926.945443] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 3e0e0504-9c76-4201-baf8-2d9636981f0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1926.955673] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance aa9c54a7-7b81-45cb-9f53-2016f4ea4b72 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
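The audit above walks ten actively managed allocations of 1 vCPU / 128 MB / 1 GB each, plus one scheduled-but-not-started instance whose allocation is deliberately left alone. The totals logged just below (used_ram=1792MB, used_disk=10GB, used_vcpus=10) follow from adding the host's 512 MB memory reservation; the arithmetic, checked in code:

    # Ten tracked instances, each 128MB / 1 vCPU / 1GB, plus 512MB reserved.
    instances = [{"MEMORY_MB": 128, "VCPU": 1, "DISK_GB": 1}] * 10
    reserved_mb = 512

    used_ram = reserved_mb + sum(i["MEMORY_MB"] for i in instances)
    used_vcpus = sum(i["VCPU"] for i in instances)
    used_disk = sum(i["DISK_GB"] for i in instances)

    assert (used_ram, used_vcpus, used_disk) == (1792, 10, 10)
    print(f"used_ram={used_ram}MB used_disk={used_disk}GB used_vcpus={used_vcpus}")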
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1926.955888] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1926.956047] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1927.078244] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ad7abc3-15c1-4fb4-9384-f6d89510e864 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1927.086660] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e4743aa-5ba5-4bd9-8404-068c67aa517e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1927.123073] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9003f731-bf2f-4829-81de-2081ec673c4c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1927.128903] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-379021f6-c6d3-45ae-8d95-c513ff45625d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1927.142162] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1927.151074] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1927.164267] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1927.164538] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.290s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1932.163967] env[67964]: DEBUG oslo_service.periodic_task [None 
req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1932.163967] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1934.802539] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1935.795561] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1935.800196] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1935.800390] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1937.800377] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1938.801133] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1938.801408] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1938.801454] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1938.821489] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1938.821657] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1938.821788] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1938.821908] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1938.822046] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1938.822172] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1938.822293] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1938.822412] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1938.822531] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1938.822648] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1938.822766] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
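The _heal_instance_info_cache pass above rebuilds its candidate list and skips every instance still in the Building state, ending with no cache work to do. A sketch of that selection loop, with illustrative field names:

    def pick_instance_to_heal(instances):
        """Return the first instance eligible for a network info cache refresh."""
        for inst in instances:
            if inst["vm_state"] == "building":
                print(f"[instance: {inst['uuid']}] Skipping network cache "
                      "update for instance because it is Building.")
                continue
            return inst
        print("Didn't find any instances for network info cache update.")
        return None

    pick_instance_to_heal([{"uuid": "ea5f3d40-6494-459a-a917-2602d0718d8c",
                            "vm_state": "building"}])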
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1938.823273] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1950.639850] env[67964]: WARNING oslo_vmware.rw_handles [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1950.639850] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1950.639850] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1950.639850] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1950.639850] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1950.639850] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 1950.639850] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1950.639850] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1950.639850] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1950.639850] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1950.639850] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1950.639850] env[67964]: ERROR oslo_vmware.rw_handles [ 1950.640531] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/86c7e199-70d5-4b2c-a93c-b8edb12268ca/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1950.642265] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1950.642521] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Copying Virtual Disk [datastore1] vmware_temp/86c7e199-70d5-4b2c-a93c-b8edb12268ca/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/86c7e199-70d5-4b2c-a93c-b8edb12268ca/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1950.642819] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d7e07b6d-1ea2-4127-be25-b462e0e13ae2 {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1950.652027] env[67964]: DEBUG oslo_vmware.api [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Waiting for the task: (returnval){
[ 1950.652027] env[67964]: value = "task-3456879"
[ 1950.652027] env[67964]: _type = "Task"
[ 1950.652027] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1950.660087] env[67964]: DEBUG oslo_vmware.api [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Task: {'id': task-3456879, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1951.163382] env[67964]: DEBUG oslo_vmware.exceptions [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1951.163638] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1951.164193] env[67964]: ERROR nova.compute.manager [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1951.164193] env[67964]: Faults: ['InvalidArgument']
[ 1951.164193] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Traceback (most recent call last):
[ 1951.164193] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources
[ 1951.164193] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] yield resources
[ 1951.164193] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1951.164193] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] self.driver.spawn(context, instance, image_meta,
[ 1951.164193] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1951.164193] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1951.164193] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1951.164193] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] self._fetch_image_if_missing(context, vi)
[ 1951.164193] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1951.164658] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] image_cache(vi, tmp_image_ds_loc)
[ 1951.164658] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1951.164658] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] vm_util.copy_virtual_disk(
[ 1951.164658] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1951.164658] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] session._wait_for_task(vmdk_copy_task)
[ 1951.164658] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1951.164658] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] return self.wait_for_task(task_ref)
[ 1951.164658] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1951.164658] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] return evt.wait()
[ 1951.164658] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1951.164658] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] result = hub.switch()
[ 1951.164658] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1951.164658] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] return self.greenlet.switch()
[ 1951.165124] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1951.165124] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] self.f(*self.args, **self.kw)
[ 1951.165124] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1951.165124] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] raise exceptions.translate_fault(task_info.error)
[ 1951.165124] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1951.165124] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Faults: ['InvalidArgument']
[ 1951.165124] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c]
[ 1951.165124] env[67964]: INFO nova.compute.manager [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Terminating instance
[ 1951.166092] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1951.166296] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1951.166537] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9a0f4e03-96ce-40fd-8af1-dd8262f87a38 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.168497] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquiring lock "refresh_cache-ea5f3d40-6494-459a-a917-2602d0718d8c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1951.168658] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquired lock "refresh_cache-ea5f3d40-6494-459a-a917-2602d0718d8c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1951.168824] env[67964]: DEBUG nova.network.neutron [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1951.175475] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1951.175665] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1951.176812] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-94ebefa7-a277-42a0-b11e-d5f7bfa8afe1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.184054] env[67964]: DEBUG oslo_vmware.api [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Waiting for the task: (returnval){
[ 1951.184054] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52384f45-050e-521d-77dd-4fe4c8432d49"
[ 1951.184054] env[67964]: _type = "Task"
[ 1951.184054] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1951.191156] env[67964]: DEBUG oslo_vmware.api [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52384f45-050e-521d-77dd-4fe4c8432d49, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1951.197142] env[67964]: DEBUG nova.network.neutron [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1951.256546] env[67964]: DEBUG nova.network.neutron [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1951.265395] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Releasing lock "refresh_cache-ea5f3d40-6494-459a-a917-2602d0718d8c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1951.265810] env[67964]: DEBUG nova.compute.manager [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1951.266011] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1951.267056] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f0b38b8-6f3b-4e72-9ee0-a0f37ebde424 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.274804] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1951.275039] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-83dfdbf9-897a-4343-8344-affbe61533f2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.302584] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1951.302789] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1951.302964] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Deleting the datastore file [datastore1] ea5f3d40-6494-459a-a917-2602d0718d8c {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1951.303530] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-66f44a1a-80d9-4b71-acb3-b4bac5055b4c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.310077] env[67964]: DEBUG oslo_vmware.api [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Waiting for the task: (returnval){
[ 1951.310077] env[67964]: value = "task-3456881"
[ 1951.310077] env[67964]: _type = "Task"
[ 1951.310077] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1951.318787] env[67964]: DEBUG oslo_vmware.api [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Task: {'id': task-3456881, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1951.694464] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1951.694813] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Creating directory with path [datastore1] vmware_temp/8aaf7b1f-4b5f-4f05-a9d1-054e467384eb/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1951.694940] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1f312fc7-ca1e-4808-9078-4b4b34047b19 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.707544] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Created directory with path [datastore1] vmware_temp/8aaf7b1f-4b5f-4f05-a9d1-054e467384eb/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1951.707744] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Fetch image to [datastore1] vmware_temp/8aaf7b1f-4b5f-4f05-a9d1-054e467384eb/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1951.708152] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/8aaf7b1f-4b5f-4f05-a9d1-054e467384eb/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1951.708755] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ae844b1-9d20-4ba5-8fcb-2c506e6acd6b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.716156] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4472e1cf-cce4-4c4d-8f53-c47891466f1e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.725581] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad57d032-0548-446e-9cd3-ea618096cc50 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.764397] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd6aaae1-299f-4bf2-af1f-949cbaaaa7b1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.770756] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a742906d-58c3-4a6d-bd07-9a226e8ad6d0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.791618] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1951.821059] env[67964]: DEBUG oslo_vmware.api [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Task: {'id': task-3456881, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.043007} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1951.824452] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1951.824641] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1951.824818] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1951.824984] env[67964]: INFO nova.compute.manager [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Took 0.56 seconds to destroy the instance on the hypervisor.
[ 1951.825244] env[67964]: DEBUG oslo.service.loopingcall [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1951.825649] env[67964]: DEBUG nova.compute.manager [-] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Skipping network deallocation for instance since networking was not requested. {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}}
[ 1951.829024] env[67964]: DEBUG nova.compute.claims [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1951.829024] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1951.829024] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1951.853933] env[67964]: DEBUG oslo_vmware.rw_handles [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8aaf7b1f-4b5f-4f05-a9d1-054e467384eb/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1951.914258] env[67964]: DEBUG oslo_vmware.rw_handles [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1951.914435] env[67964]: DEBUG oslo_vmware.rw_handles [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8aaf7b1f-4b5f-4f05-a9d1-054e467384eb/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1952.034315] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1693549-a058-41dd-836f-3ccdb76ce26e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.041408] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-073693fb-0bf7-4291-9202-4a6cb9021686 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.071453] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b674d6f1-3410-4d52-a59d-ae2272737269 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.078344] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce8fd488-9f45-4416-bd5a-b6e1f806e2db {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.090859] env[67964]: DEBUG nova.compute.provider_tree [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1952.098980] env[67964]: DEBUG nova.scheduler.client.report [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}}
[ 1952.112023] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.284s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1952.112552] env[67964]: ERROR nova.compute.manager [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1952.112552] env[67964]: Faults: ['InvalidArgument']
[ 1952.112552] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Traceback (most recent call last):
[ 1952.112552] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1952.112552] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] self.driver.spawn(context, instance, image_meta,
[ 1952.112552] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1952.112552] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1952.112552] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1952.112552] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] self._fetch_image_if_missing(context, vi)
[ 1952.112552] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1952.112552] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] image_cache(vi, tmp_image_ds_loc)
[ 1952.112552] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1952.112926] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] vm_util.copy_virtual_disk(
[ 1952.112926] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1952.112926] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] session._wait_for_task(vmdk_copy_task)
[ 1952.112926] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1952.112926] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] return self.wait_for_task(task_ref)
[ 1952.112926] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1952.112926] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] return evt.wait()
[ 1952.112926] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1952.112926] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] result = hub.switch()
[ 1952.112926] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1952.112926] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] return self.greenlet.switch()
[ 1952.112926] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1952.112926] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] self.f(*self.args, **self.kw)
[ 1952.113373] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1952.113373] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] raise exceptions.translate_fault(task_info.error)
[ 1952.113373] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1952.113373] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Faults: ['InvalidArgument']
[ 1952.113373] env[67964]: ERROR nova.compute.manager [instance: ea5f3d40-6494-459a-a917-2602d0718d8c]
[ 1952.113373] env[67964]: DEBUG nova.compute.utils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1952.114630] env[67964]: DEBUG nova.compute.manager [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Build of instance ea5f3d40-6494-459a-a917-2602d0718d8c was re-scheduled: A specified parameter was not correct: fileType
[ 1952.114630] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 1952.114996] env[67964]: DEBUG nova.compute.manager [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 1952.115233] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquiring lock "refresh_cache-ea5f3d40-6494-459a-a917-2602d0718d8c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1952.115382] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquired lock "refresh_cache-ea5f3d40-6494-459a-a917-2602d0718d8c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1952.115533] env[67964]: DEBUG nova.network.neutron [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1952.138577] env[67964]: DEBUG nova.network.neutron [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1952.196604] env[67964]: DEBUG nova.network.neutron [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1952.205129] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Releasing lock "refresh_cache-ea5f3d40-6494-459a-a917-2602d0718d8c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1952.205356] env[67964]: DEBUG nova.compute.manager [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 1952.205592] env[67964]: DEBUG nova.compute.manager [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Skipping network deallocation for instance since networking was not requested. {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}}
[ 1952.309880] env[67964]: INFO nova.scheduler.client.report [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Deleted allocations for instance ea5f3d40-6494-459a-a917-2602d0718d8c
[ 1952.350047] env[67964]: DEBUG oslo_concurrency.lockutils [None req-29294e10-7a7c-4a13-9dea-ae6493ee3cf3 tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Lock "ea5f3d40-6494-459a-a917-2602d0718d8c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 572.229s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1952.351139] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Lock "ea5f3d40-6494-459a-a917-2602d0718d8c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 375.829s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1952.351579] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquiring lock "ea5f3d40-6494-459a-a917-2602d0718d8c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1952.351732] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Lock "ea5f3d40-6494-459a-a917-2602d0718d8c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1952.351905] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Lock "ea5f3d40-6494-459a-a917-2602d0718d8c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1952.353961] env[67964]: INFO nova.compute.manager [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Terminating instance
[ 1952.355627] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquiring lock "refresh_cache-ea5f3d40-6494-459a-a917-2602d0718d8c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1952.355781] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Acquired lock "refresh_cache-ea5f3d40-6494-459a-a917-2602d0718d8c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1952.355948] env[67964]: DEBUG nova.network.neutron [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1952.364142] env[67964]: DEBUG nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1952.383416] env[67964]: DEBUG nova.network.neutron [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1952.408847] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1952.409114] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1952.410830] env[67964]: INFO nova.compute.claims [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1952.578661] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7d79665-4b44-49cc-89dc-dbc596714a69 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.587825] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4b0c1da-70f1-45e3-8c98-ce4412acbc98 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.617379] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fb50a1d-0992-468a-9c30-c70e81314586 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.624614] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d47826e7-f6b8-4979-9db8-5d1519b03578 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.637245] env[67964]: DEBUG nova.compute.provider_tree [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1952.645759] env[67964]: DEBUG nova.scheduler.client.report [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}}
[ 1952.658361] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.249s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1952.658809] env[67964]: DEBUG nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 1952.675750] env[67964]: DEBUG nova.network.neutron [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1952.683779] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Releasing lock "refresh_cache-ea5f3d40-6494-459a-a917-2602d0718d8c" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1952.684118] env[67964]: DEBUG nova.compute.manager [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1952.684403] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1952.684954] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-46942f3b-fb32-459c-8b98-8b2e7abd04c8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.691132] env[67964]: DEBUG nova.compute.utils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1952.693179] env[67964]: DEBUG nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}}
[ 1952.693397] env[67964]: DEBUG nova.network.neutron [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1952.698917] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a639417b-38b5-4d1e-bcc7-1e9ea2095934 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.710296] env[67964]: DEBUG nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 1952.731187] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ea5f3d40-6494-459a-a917-2602d0718d8c could not be found.
[ 1952.731422] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1952.731568] env[67964]: INFO nova.compute.manager [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 1952.731796] env[67964]: DEBUG oslo.service.loopingcall [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1952.732033] env[67964]: DEBUG nova.compute.manager [-] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1952.732137] env[67964]: DEBUG nova.network.neutron [-] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1952.756806] env[67964]: DEBUG nova.network.neutron [-] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1952.763404] env[67964]: DEBUG nova.network.neutron [-] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1952.774110] env[67964]: DEBUG nova.policy [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7373f7b862cc4f43a074101da040ac07', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '30050a5e509146ea87e6a86263ba0f59', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1952.777007] env[67964]: DEBUG nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Start spawning the instance on the hypervisor. {{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
[ 1952.780254] env[67964]: INFO nova.compute.manager [-] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] Took 0.05 seconds to deallocate network for instance.
[ 1952.798308] env[67964]: DEBUG nova.virt.hardware [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=<?>,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T12:20:06Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1952.798550] env[67964]: DEBUG nova.virt.hardware [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1952.798772] env[67964]: DEBUG nova.virt.hardware [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1952.798962] env[67964]: DEBUG nova.virt.hardware [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1952.799122] env[67964]: DEBUG nova.virt.hardware [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1952.799267] env[67964]: DEBUG nova.virt.hardware [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1952.799469] env[67964]: DEBUG nova.virt.hardware [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1952.799622] env[67964]: DEBUG nova.virt.hardware [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1952.799780] env[67964]: DEBUG nova.virt.hardware [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1952.799932] env[67964]: DEBUG nova.virt.hardware [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1952.800178] env[67964]: DEBUG nova.virt.hardware [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1952.800941] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3be76cf7-11c7-496e-a51e-036ff821b71a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.809108] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d8b4472-3adb-4e17-93ef-74cf27204617 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.873419] env[67964]: DEBUG oslo_concurrency.lockutils [None req-0c458ead-6f07-473b-af88-3c392a1e5d5d tempest-ServerShowV254Test-1105176099 tempest-ServerShowV254Test-1105176099-project-member] Lock "ea5f3d40-6494-459a-a917-2602d0718d8c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.522s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1952.874283] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "ea5f3d40-6494-459a-a917-2602d0718d8c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 179.148s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1952.874473] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: ea5f3d40-6494-459a-a917-2602d0718d8c] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1952.874642] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "ea5f3d40-6494-459a-a917-2602d0718d8c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1953.238193] env[67964]: DEBUG nova.network.neutron [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Successfully created port: 7d144b5c-c875-41e0-95e9-ab05a55e9bf9 {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1953.770183] env[67964]: DEBUG nova.compute.manager [req-e2ca346a-49a2-4136-bb4c-4fc9580efdb9 req-e7e3a773-34ab-4e67-b603-39c06f7762ec service nova] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Received event network-vif-plugged-7d144b5c-c875-41e0-95e9-ab05a55e9bf9 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}}
[ 1953.770430] env[67964]: DEBUG oslo_concurrency.lockutils [req-e2ca346a-49a2-4136-bb4c-4fc9580efdb9 req-e7e3a773-34ab-4e67-b603-39c06f7762ec service nova] Acquiring lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1953.770596] env[67964]: DEBUG oslo_concurrency.lockutils [req-e2ca346a-49a2-4136-bb4c-4fc9580efdb9 req-e7e3a773-34ab-4e67-b603-39c06f7762ec service nova] Lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1953.770760] env[67964]: DEBUG oslo_concurrency.lockutils [req-e2ca346a-49a2-4136-bb4c-4fc9580efdb9 req-e7e3a773-34ab-4e67-b603-39c06f7762ec service nova] Lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1953.772584] env[67964]: DEBUG nova.compute.manager [req-e2ca346a-49a2-4136-bb4c-4fc9580efdb9 req-e7e3a773-34ab-4e67-b603-39c06f7762ec service nova] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] No waiting events found dispatching network-vif-plugged-7d144b5c-c875-41e0-95e9-ab05a55e9bf9 {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1953.772772] env[67964]: WARNING nova.compute.manager [req-e2ca346a-49a2-4136-bb4c-4fc9580efdb9 req-e7e3a773-34ab-4e67-b603-39c06f7762ec service nova] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Received unexpected event network-vif-plugged-7d144b5c-c875-41e0-95e9-ab05a55e9bf9 for instance with vm_state building and task_state spawning.
[ 1953.811683] env[67964]: DEBUG nova.network.neutron [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Successfully updated port: 7d144b5c-c875-41e0-95e9-ab05a55e9bf9 {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1953.823270] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "refresh_cache-aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1953.823588] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquired lock "refresh_cache-aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1953.823838] env[67964]: DEBUG nova.network.neutron [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1953.861225] env[67964]: DEBUG nova.network.neutron [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1954.057585] env[67964]: DEBUG nova.network.neutron [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Updating instance_info_cache with network_info: [{"id": "7d144b5c-c875-41e0-95e9-ab05a55e9bf9", "address": "fa:16:3e:19:11:1b", "network": {"id": "4688491e-7bc1-42dc-b5f6-d988d578de92", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1770914470-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30050a5e509146ea87e6a86263ba0f59", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b107fab-ee71-47db-ad4d-3c6f05546843", "external-id": "cl2-zone-554", "segmentation_id": 554, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7d144b5c-c8", "ovs_interfaceid": "7d144b5c-c875-41e0-95e9-ab05a55e9bf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1954.068622] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Releasing lock "refresh_cache-aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1954.068915] env[67964]: DEBUG nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Instance network_info: |[{"id": "7d144b5c-c875-41e0-95e9-ab05a55e9bf9", "address": "fa:16:3e:19:11:1b", "network": {"id": "4688491e-7bc1-42dc-b5f6-d988d578de92", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1770914470-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30050a5e509146ea87e6a86263ba0f59", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b107fab-ee71-47db-ad4d-3c6f05546843", "external-id": "cl2-zone-554", "segmentation_id": 554, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7d144b5c-c8", "ovs_interfaceid": "7d144b5c-c875-41e0-95e9-ab05a55e9bf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}}
[ 1954.069302] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:19:11:1b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3b107fab-ee71-47db-ad4d-3c6f05546843', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7d144b5c-c875-41e0-95e9-ab05a55e9bf9', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1954.077168] env[67964]: DEBUG oslo.service.loopingcall [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1954.077683] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1954.077970] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-685b662f-ff2b-4aa5-8bf8-b1886c3b4673 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1954.099091] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1954.099091] env[67964]: value = "task-3456882"
[ 1954.099091] env[67964]: _type = "Task"
[ 1954.099091] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1954.109034] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456882, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1954.612779] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456882, 'name': CreateVM_Task, 'duration_secs': 0.284102} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1954.612961] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1954.613859] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1954.614121] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1954.614546] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1954.614883] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-98d3c24f-d398-423b-a60b-908815095a6c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1954.619089] env[67964]: DEBUG oslo_vmware.api [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for the task: (returnval){
[ 1954.619089] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]522e3bc2-051b-f93e-b767-f2f7d80dc7e9"
[ 1954.619089] env[67964]: _type = "Task"
[ 1954.619089] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1954.626015] env[67964]: DEBUG oslo_vmware.api [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]522e3bc2-051b-f93e-b767-f2f7d80dc7e9, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1955.130958] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1955.131298] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1955.131484] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1955.796530] env[67964]: DEBUG nova.compute.manager [req-060ae040-0a95-478e-98e9-17149deb6782 req-2f1d6f9f-cc98-40c0-8e51-161737d0ab13 service nova] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Received event network-changed-7d144b5c-c875-41e0-95e9-ab05a55e9bf9 {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1955.796723] env[67964]: DEBUG nova.compute.manager [req-060ae040-0a95-478e-98e9-17149deb6782 req-2f1d6f9f-cc98-40c0-8e51-161737d0ab13 service nova] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Refreshing instance network info cache due to event network-changed-7d144b5c-c875-41e0-95e9-ab05a55e9bf9. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1955.796935] env[67964]: DEBUG oslo_concurrency.lockutils [req-060ae040-0a95-478e-98e9-17149deb6782 req-2f1d6f9f-cc98-40c0-8e51-161737d0ab13 service nova] Acquiring lock "refresh_cache-aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1955.797092] env[67964]: DEBUG oslo_concurrency.lockutils [req-060ae040-0a95-478e-98e9-17149deb6782 req-2f1d6f9f-cc98-40c0-8e51-161737d0ab13 service nova] Acquired lock "refresh_cache-aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1955.797257] env[67964]: DEBUG nova.network.neutron [req-060ae040-0a95-478e-98e9-17149deb6782 req-2f1d6f9f-cc98-40c0-8e51-161737d0ab13 service nova] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Refreshing network info cache for port 7d144b5c-c875-41e0-95e9-ab05a55e9bf9 {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1956.099709] env[67964]: DEBUG nova.network.neutron [req-060ae040-0a95-478e-98e9-17149deb6782 req-2f1d6f9f-cc98-40c0-8e51-161737d0ab13 service nova] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Updated VIF entry in instance network info cache for port 7d144b5c-c875-41e0-95e9-ab05a55e9bf9. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1956.100070] env[67964]: DEBUG nova.network.neutron [req-060ae040-0a95-478e-98e9-17149deb6782 req-2f1d6f9f-cc98-40c0-8e51-161737d0ab13 service nova] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Updating instance_info_cache with network_info: [{"id": "7d144b5c-c875-41e0-95e9-ab05a55e9bf9", "address": "fa:16:3e:19:11:1b", "network": {"id": "4688491e-7bc1-42dc-b5f6-d988d578de92", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1770914470-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30050a5e509146ea87e6a86263ba0f59", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b107fab-ee71-47db-ad4d-3c6f05546843", "external-id": "cl2-zone-554", "segmentation_id": 554, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7d144b5c-c8", "ovs_interfaceid": "7d144b5c-c875-41e0-95e9-ab05a55e9bf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1956.108953] env[67964]: DEBUG oslo_concurrency.lockutils [req-060ae040-0a95-478e-98e9-17149deb6782 req-2f1d6f9f-cc98-40c0-8e51-161737d0ab13 service nova] Releasing lock "refresh_cache-aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1986.801619] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1986.813015] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1986.813265] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1986.813462] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1986.813621] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1986.814725] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff80d7e2-f73e-498d-987c-3e103b0dc53a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1986.823544] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c7d76f4-3897-41a8-a1b7-4eea89867b24 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1986.837255] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5db781ed-8ea7-470f-8d42-d9312f00229e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1986.843484] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b32f444b-f822-44f5-a433-c86e305449ee {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1986.873066] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180842MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1986.873208] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1986.873393] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1986.942249] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance da8f11e2-6d58-4e28-aabb-9943bc657e60 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1986.942424] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2c06844d-2c7f-4e27-b3c6-16dfd6047119 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1986.942552] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 41d93bf8-7991-4b52-8ebb-a1988dc627c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1986.942675] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 430cad73-6b2c-4702-96a0-672f5b4c219f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1986.942793] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance bc98edf7-889e-4814-b859-d860033ba0cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1986.942910] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c01bc11b-384e-418e-be43-e12d0a845a24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1986.943036] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1986.943153] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 07489f39-f57c-4528-80b8-b42056181b8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1986.943267] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 3e0e0504-9c76-4201-baf8-2d9636981f0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1986.943379] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance aa9c54a7-7b81-45cb-9f53-2016f4ea4b72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1986.943625] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1986.943775] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1987.057071] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10c75274-50ea-44c9-9caa-8ec25291da4c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1987.064752] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f850357f-4225-444e-97b3-ae4782d98496 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1987.094286] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebec261c-fa57-4f16-b5f7-9562a040818d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1987.101651] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf18dca7-b86c-4e31-8c9e-2206fcaf5aa5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1987.115087] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1987.122948] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 1987.137707] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1987.137977] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.264s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1987.549027] env[67964]: DEBUG oslo_concurrency.lockutils [None 
req-8faea3a4-d2ee-4dd8-ba5c-54011e3622fc tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "07489f39-f57c-4528-80b8-b42056181b8b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1990.136815] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a61a45b4-07cf-4ffe-a3e1-424865688cd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "3e0e0504-9c76-4201-baf8-2d9636981f0c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1994.137538] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1994.137847] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1995.796217] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1995.799891] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1997.800837] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1997.801092] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1998.800904] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1998.801291] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1998.801291] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1998.823601] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Skipping network cache update for instance 
because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1998.823768] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1998.823904] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1998.824040] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1998.824165] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1998.824282] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1998.824402] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1998.824518] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1998.824647] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1998.824769] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1998.824886] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1998.825379] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1999.799579] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2000.024573] env[67964]: WARNING oslo_vmware.rw_handles [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2000.024573] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2000.024573] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2000.024573] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2000.024573] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2000.024573] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 2000.024573] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2000.024573] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2000.024573] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2000.024573] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2000.024573] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2000.024573] env[67964]: ERROR oslo_vmware.rw_handles [ 2000.025372] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/8aaf7b1f-4b5f-4f05-a9d1-054e467384eb/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2000.027662] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2000.027893] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Copying Virtual Disk [datastore1] vmware_temp/8aaf7b1f-4b5f-4f05-a9d1-054e467384eb/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] 
vmware_temp/8aaf7b1f-4b5f-4f05-a9d1-054e467384eb/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2000.028198] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4036ce0c-ef13-4958-ba46-0b96eaee5e7f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.036702] env[67964]: DEBUG oslo_vmware.api [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Waiting for the task: (returnval){ [ 2000.036702] env[67964]: value = "task-3456883" [ 2000.036702] env[67964]: _type = "Task" [ 2000.036702] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2000.044519] env[67964]: DEBUG oslo_vmware.api [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Task: {'id': task-3456883, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2000.547956] env[67964]: DEBUG oslo_vmware.exceptions [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2000.548295] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2000.548845] env[67964]: ERROR nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2000.548845] env[67964]: Faults: ['InvalidArgument'] [ 2000.548845] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Traceback (most recent call last): [ 2000.548845] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2000.548845] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] yield resources [ 2000.548845] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2000.548845] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] self.driver.spawn(context, instance, image_meta, [ 2000.548845] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2000.548845] env[67964]: ERROR nova.compute.manager [instance: 
da8f11e2-6d58-4e28-aabb-9943bc657e60] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2000.548845] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2000.548845] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] self._fetch_image_if_missing(context, vi) [ 2000.548845] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2000.549301] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] image_cache(vi, tmp_image_ds_loc) [ 2000.549301] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2000.549301] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] vm_util.copy_virtual_disk( [ 2000.549301] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2000.549301] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] session._wait_for_task(vmdk_copy_task) [ 2000.549301] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2000.549301] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] return self.wait_for_task(task_ref) [ 2000.549301] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2000.549301] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] return evt.wait() [ 2000.549301] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2000.549301] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] result = hub.switch() [ 2000.549301] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2000.549301] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] return self.greenlet.switch() [ 2000.549755] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2000.549755] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] self.f(*self.args, **self.kw) [ 2000.549755] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2000.549755] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] raise exceptions.translate_fault(task_info.error) [ 2000.549755] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2000.549755] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Faults: ['InvalidArgument'] [ 2000.549755] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] [ 2000.549755] env[67964]: INFO nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Terminating instance [ 2000.551048] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2000.551048] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2000.551274] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5f9f41c5-7da5-416c-97d9-31f25298dbfa {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.553288] env[67964]: DEBUG nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2000.553471] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2000.554207] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a2695e6-2a34-4803-a991-033e987aef53 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.561290] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2000.561456] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0bc4b00e-a106-41c4-997f-ac0bf18645cd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.637570] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2000.637777] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2000.637953] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Deleting the datastore file [datastore1] da8f11e2-6d58-4e28-aabb-9943bc657e60 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2000.638225] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-563d6fbf-ce3b-49fc-b1bd-1f350acd1fd3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.645970] env[67964]: DEBUG oslo_vmware.api [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Waiting for the task: (returnval){ [ 2000.645970] env[67964]: value = "task-3456885" [ 2000.645970] env[67964]: _type = "Task" [ 2000.645970] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2000.653173] env[67964]: DEBUG oslo_vmware.api [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Task: {'id': task-3456885, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2000.814260] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2000.814440] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2000.815222] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-05e1937a-8e9c-4c40-a753-df1f480abafb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.820966] env[67964]: DEBUG oslo_vmware.api [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for the task: (returnval){ [ 2000.820966] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52030dbc-ba80-b08c-0be4-ee7be0ed081d" [ 2000.820966] env[67964]: _type = "Task" [ 2000.820966] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2000.828549] env[67964]: DEBUG oslo_vmware.api [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52030dbc-ba80-b08c-0be4-ee7be0ed081d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2001.155515] env[67964]: DEBUG oslo_vmware.api [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Task: {'id': task-3456885, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065403} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2001.155920] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2001.155920] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2001.156090] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2001.156215] env[67964]: INFO nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Took 0.60 seconds to destroy the instance on the hypervisor. [ 2001.158348] env[67964]: DEBUG nova.compute.claims [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2001.158516] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2001.158728] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2001.327963] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac572117-e37b-42c2-8413-f72f425dc3a4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.337581] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2001.337821] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Creating directory with path [datastore1] 
vmware_temp/8d247521-6d94-4d2d-94ed-0871848da372/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2001.338088] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-33663d46-4ea0-449b-8a1a-a12804a61b48 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.340335] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9156f580-6651-4a3b-90e8-77bdf8aeec0f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.370770] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa6a865f-fa1c-4f5e-8e70-59232a60388c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.373125] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Created directory with path [datastore1] vmware_temp/8d247521-6d94-4d2d-94ed-0871848da372/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2001.373313] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Fetch image to [datastore1] vmware_temp/8d247521-6d94-4d2d-94ed-0871848da372/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2001.373479] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/8d247521-6d94-4d2d-94ed-0871848da372/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2001.374163] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46a535c6-de66-4190-bdc4-1ce648608cb1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.382713] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21f1995f-4022-4bfb-b2a3-fd599cdae93f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.386680] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82ed8fb3-6e5a-4bc4-8321-ac92ac848b5d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.398226] env[67964]: DEBUG nova.compute.provider_tree [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2001.404281] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05659b85-8629-4006-876f-822c4236e5c0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.408993] env[67964]: DEBUG nova.scheduler.client.report [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2001.440200] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.281s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2001.440736] env[67964]: ERROR nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2001.440736] env[67964]: Faults: ['InvalidArgument'] [ 2001.440736] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Traceback (most recent call last): [ 2001.440736] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2001.440736] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] self.driver.spawn(context, instance, image_meta, [ 2001.440736] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2001.440736] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2001.440736] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2001.440736] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] self._fetch_image_if_missing(context, vi) [ 2001.440736] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2001.440736] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] image_cache(vi, tmp_image_ds_loc) [ 2001.440736] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2001.441131] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] vm_util.copy_virtual_disk( [ 2001.441131] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2001.441131] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] session._wait_for_task(vmdk_copy_task) [ 2001.441131] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2001.441131] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] return self.wait_for_task(task_ref) [ 2001.441131] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2001.441131] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] return evt.wait() [ 2001.441131] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2001.441131] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] result = hub.switch() [ 2001.441131] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2001.441131] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] return self.greenlet.switch() [ 2001.441131] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2001.441131] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] self.f(*self.args, **self.kw) [ 2001.441513] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2001.441513] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] raise exceptions.translate_fault(task_info.error) [ 2001.441513] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2001.441513] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Faults: ['InvalidArgument'] [ 2001.441513] env[67964]: ERROR nova.compute.manager [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] [ 2001.441513] env[67964]: DEBUG nova.compute.utils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2001.442871] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ceaf65fc-d994-41b6-adb3-b0446065ba98 {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.445526] env[67964]: DEBUG nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Build of instance da8f11e2-6d58-4e28-aabb-9943bc657e60 was re-scheduled: A specified parameter was not correct: fileType [ 2001.445526] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2001.445906] env[67964]: DEBUG nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2001.446086] env[67964]: DEBUG nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2001.446257] env[67964]: DEBUG nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2001.446418] env[67964]: DEBUG nova.network.neutron [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2001.452059] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8c7f2bb0-1963-45f8-9553-50a41de310f8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.472724] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2001.520431] env[67964]: DEBUG oslo_vmware.rw_handles [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8d247521-6d94-4d2d-94ed-0871848da372/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2001.579214] env[67964]: DEBUG oslo_vmware.rw_handles [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2001.579406] env[67964]: DEBUG oslo_vmware.rw_handles [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8d247521-6d94-4d2d-94ed-0871848da372/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2001.708123] env[67964]: DEBUG nova.network.neutron [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2001.722362] env[67964]: INFO nova.compute.manager [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Took 0.28 seconds to deallocate network for instance. [ 2001.806648] env[67964]: INFO nova.scheduler.client.report [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Deleted allocations for instance da8f11e2-6d58-4e28-aabb-9943bc657e60 [ 2001.827456] env[67964]: DEBUG oslo_concurrency.lockutils [None req-c9a1c418-c2d2-456d-95a5-7d5211ea5cff tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "da8f11e2-6d58-4e28-aabb-9943bc657e60" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 603.338s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2001.827757] env[67964]: DEBUG oslo_concurrency.lockutils [None req-311b6aa5-b682-4f4f-94bb-e3ee348b4f9a tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "da8f11e2-6d58-4e28-aabb-9943bc657e60" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 407.571s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2001.827985] env[67964]: DEBUG oslo_concurrency.lockutils [None req-311b6aa5-b682-4f4f-94bb-e3ee348b4f9a tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "da8f11e2-6d58-4e28-aabb-9943bc657e60-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2001.828218] env[67964]: DEBUG oslo_concurrency.lockutils [None req-311b6aa5-b682-4f4f-94bb-e3ee348b4f9a tempest-DeleteServersTestJSON-2048211470
tempest-DeleteServersTestJSON-2048211470-project-member] Lock "da8f11e2-6d58-4e28-aabb-9943bc657e60-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2001.828386] env[67964]: DEBUG oslo_concurrency.lockutils [None req-311b6aa5-b682-4f4f-94bb-e3ee348b4f9a tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "da8f11e2-6d58-4e28-aabb-9943bc657e60-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2001.830241] env[67964]: INFO nova.compute.manager [None req-311b6aa5-b682-4f4f-94bb-e3ee348b4f9a tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Terminating instance [ 2001.831961] env[67964]: DEBUG nova.compute.manager [None req-311b6aa5-b682-4f4f-94bb-e3ee348b4f9a tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2001.832171] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-311b6aa5-b682-4f4f-94bb-e3ee348b4f9a tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2001.832614] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-651aad5e-4bc9-478a-943e-e9cb23e61390 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.841635] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-716d71df-13af-4016-839e-f31cc5281f6a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.870300] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-311b6aa5-b682-4f4f-94bb-e3ee348b4f9a tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance da8f11e2-6d58-4e28-aabb-9943bc657e60 could not be found. [ 2001.870496] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-311b6aa5-b682-4f4f-94bb-e3ee348b4f9a tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2001.870666] env[67964]: INFO nova.compute.manager [None req-311b6aa5-b682-4f4f-94bb-e3ee348b4f9a tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Took 0.04 seconds to destroy the instance on the hypervisor.
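The "da8f11e2-...-events" lock lines above are emitted by oslo.concurrency around a nested helper; the nesting is also why the logged qualified names carry a ".<locals>." segment (e.g. InstanceEvents.clear_events_for_instance.<locals>._clear_events). A minimal sketch of that per-instance event-lock pattern, assuming only oslo.concurrency; the event store and function bodies here are illustrative stand-ins, not Nova's implementation:

    # Sketch: serialize event handling per instance with a "<uuid>-events"
    # lock, mirroring the lock names in the log lines above.
    from oslo_concurrency import lockutils

    _events = {}  # hypothetical store: instance uuid -> list of pending events

    def clear_events_for_instance(instance_uuid):
        # The synchronized decorator on a nested function is what produces
        # 'Lock "<uuid>-events" acquired by "...<locals>._clear_events"'
        # style messages with waited/held timings.
        @lockutils.synchronized('%s-events' % instance_uuid)
        def _clear_events():
            return _events.pop(instance_uuid, [])
        return _clear_events()

Keying the lock on the instance UUID keeps event handling serialized per instance while leaving unrelated instances unblocked, which is why the waits logged above are 0.000s even under concurrent tempest load.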
[ 2001.870899] env[67964]: DEBUG oslo.service.loopingcall [None req-311b6aa5-b682-4f4f-94bb-e3ee348b4f9a tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2001.871128] env[67964]: DEBUG nova.compute.manager [-] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2001.871224] env[67964]: DEBUG nova.network.neutron [-] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2001.990806] env[67964]: DEBUG nova.network.neutron [-] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2002.004953] env[67964]: INFO nova.compute.manager [-] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] Took 0.13 seconds to deallocate network for instance. [ 2002.100194] env[67964]: DEBUG oslo_concurrency.lockutils [None req-311b6aa5-b682-4f4f-94bb-e3ee348b4f9a tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "da8f11e2-6d58-4e28-aabb-9943bc657e60" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.272s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2002.101052] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "da8f11e2-6d58-4e28-aabb-9943bc657e60" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 228.375s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2002.101271] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: da8f11e2-6d58-4e28-aabb-9943bc657e60] During sync_power_state the instance has a pending task (deleting). Skip.
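The "Waiting for function ... to return." line above comes from oslo.service's looping-call machinery, which re-invokes the nested _deallocate_network_with_retries helper until it signals completion. A minimal sketch of driving such a loop with oslo_service.loopingcall, under the assumption that the stand-in _deallocate body below succeeds on its third attempt (it is not Nova's code):

    # Sketch: a fixed-interval retry loop in the style logged above.
    from oslo_service import loopingcall

    attempts = {'n': 0}

    def _deallocate():
        attempts['n'] += 1
        if attempts['n'] >= 3:
            # Raising LoopingCallDone stops the loop; retvalue becomes the
            # result returned by wait() below.
            raise loopingcall.LoopingCallDone(retvalue=True)

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate)
    result = timer.start(interval=0.5).wait()  # blocks until LoopingCallDone

The loop calls the function every interval seconds; anything short of LoopingCallDone keeps it running, which is what the DEBUG line is reporting while the greenthread waits.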
[ 2002.101446] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "da8f11e2-6d58-4e28-aabb-9943bc657e60" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2006.578910] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e136eedd-a9ff-406a-9030-605e786dff75 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2011.796208] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2040.809432] env[67964]: DEBUG oslo_concurrency.lockutils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "2a0e1c08-8201-4ed7-9072-fdd90f25f120" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2040.809769] env[67964]: DEBUG oslo_concurrency.lockutils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "2a0e1c08-8201-4ed7-9072-fdd90f25f120" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2040.820966] env[67964]: DEBUG nova.compute.manager [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Starting instance...
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2040.870645] env[67964]: DEBUG oslo_concurrency.lockutils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2040.870888] env[67964]: DEBUG oslo_concurrency.lockutils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2040.872360] env[67964]: INFO nova.compute.claims [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2041.037328] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7708eea2-c41a-4447-bf9b-ee7fcf920c93 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2041.045429] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b012b769-4001-4626-99a2-c1808c02ebee {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2041.076689] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05fc449b-c210-456d-953c-2cc13b3b6d75 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2041.084056] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c94984ce-427c-4f74-8a2e-2f00908083b0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2041.097437] env[67964]: DEBUG nova.compute.provider_tree [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2041.107168] env[67964]: DEBUG nova.scheduler.client.report [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2041.124782] env[67964]: DEBUG oslo_concurrency.lockutils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 
tempest-ServersTestJSON-385895298-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.254s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2041.125387] env[67964]: DEBUG nova.compute.manager [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2041.161339] env[67964]: DEBUG nova.compute.utils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2041.162551] env[67964]: DEBUG nova.compute.manager [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 2041.162717] env[67964]: DEBUG nova.network.neutron [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2041.170801] env[67964]: DEBUG nova.compute.manager [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2041.229171] env[67964]: DEBUG nova.compute.manager [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2041.252825] env[67964]: DEBUG nova.policy [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7f051b7129e94ac6b20334f348756b49', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2b7f4b97c0ca4859964e6ea23310e9ce', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 2041.264046] env[67964]: DEBUG nova.virt.hardware [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2041.264301] env[67964]: DEBUG nova.virt.hardware [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2041.264456] env[67964]: DEBUG nova.virt.hardware [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2041.264631] env[67964]: DEBUG nova.virt.hardware [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2041.264776] env[67964]: DEBUG nova.virt.hardware [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2041.264935] env[67964]: DEBUG nova.virt.hardware [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2041.265139] env[67964]: DEBUG nova.virt.hardware [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 
tempest-ServersTestJSON-385895298-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2041.265294] env[67964]: DEBUG nova.virt.hardware [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2041.265457] env[67964]: DEBUG nova.virt.hardware [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2041.265615] env[67964]: DEBUG nova.virt.hardware [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2041.265784] env[67964]: DEBUG nova.virt.hardware [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2041.266919] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9373e312-679f-4086-9660-3041733ae13a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2041.274424] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1a19951-4e1a-4b6d-a52d-41f4933e9945 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2041.548267] env[67964]: DEBUG nova.network.neutron [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Successfully created port: 4c2b4113-e4a7-40bf-84ee-fc649f767bcb {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2042.055300] env[67964]: DEBUG nova.compute.manager [req-53cae0e5-a5f6-4ad7-baaa-17d8247a8ed8 req-a6331988-2c1d-4c61-9f61-65e0ed6b59fe service nova] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Received event network-vif-plugged-4c2b4113-e4a7-40bf-84ee-fc649f767bcb {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 2042.055554] env[67964]: DEBUG oslo_concurrency.lockutils [req-53cae0e5-a5f6-4ad7-baaa-17d8247a8ed8 req-a6331988-2c1d-4c61-9f61-65e0ed6b59fe service nova] Acquiring lock "2a0e1c08-8201-4ed7-9072-fdd90f25f120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2042.055710] env[67964]: DEBUG oslo_concurrency.lockutils [req-53cae0e5-a5f6-4ad7-baaa-17d8247a8ed8 req-a6331988-2c1d-4c61-9f61-65e0ed6b59fe service nova] Lock "2a0e1c08-8201-4ed7-9072-fdd90f25f120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964)
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2042.055880] env[67964]: DEBUG oslo_concurrency.lockutils [req-53cae0e5-a5f6-4ad7-baaa-17d8247a8ed8 req-a6331988-2c1d-4c61-9f61-65e0ed6b59fe service nova] Lock "2a0e1c08-8201-4ed7-9072-fdd90f25f120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2042.056049] env[67964]: DEBUG nova.compute.manager [req-53cae0e5-a5f6-4ad7-baaa-17d8247a8ed8 req-a6331988-2c1d-4c61-9f61-65e0ed6b59fe service nova] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] No waiting events found dispatching network-vif-plugged-4c2b4113-e4a7-40bf-84ee-fc649f767bcb {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2042.056334] env[67964]: WARNING nova.compute.manager [req-53cae0e5-a5f6-4ad7-baaa-17d8247a8ed8 req-a6331988-2c1d-4c61-9f61-65e0ed6b59fe service nova] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Received unexpected event network-vif-plugged-4c2b4113-e4a7-40bf-84ee-fc649f767bcb for instance with vm_state building and task_state spawning. [ 2042.133218] env[67964]: DEBUG nova.network.neutron [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Successfully updated port: 4c2b4113-e4a7-40bf-84ee-fc649f767bcb {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2042.149933] env[67964]: DEBUG oslo_concurrency.lockutils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "refresh_cache-2a0e1c08-8201-4ed7-9072-fdd90f25f120" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2042.150486] env[67964]: DEBUG oslo_concurrency.lockutils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquired lock "refresh_cache-2a0e1c08-8201-4ed7-9072-fdd90f25f120" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2042.150486] env[67964]: DEBUG nova.network.neutron [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2042.201130] env[67964]: DEBUG nova.network.neutron [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Instance cache missing network info.
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2042.582602] env[67964]: DEBUG nova.network.neutron [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Updating instance_info_cache with network_info: [{"id": "4c2b4113-e4a7-40bf-84ee-fc649f767bcb", "address": "fa:16:3e:99:27:58", "network": {"id": "aeaa87c8-704e-479e-9bca-e70f676fcf32", "bridge": "br-int", "label": "tempest-ServersTestJSON-644115068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2b7f4b97c0ca4859964e6ea23310e9ce", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7d689fd7-f53e-4fd3-80d9-8d6b8fb7a164", "external-id": "nsx-vlan-transportzone-972", "segmentation_id": 972, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4c2b4113-e4", "ovs_interfaceid": "4c2b4113-e4a7-40bf-84ee-fc649f767bcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2042.595114] env[67964]: DEBUG oslo_concurrency.lockutils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Releasing lock "refresh_cache-2a0e1c08-8201-4ed7-9072-fdd90f25f120" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2042.595450] env[67964]: DEBUG nova.compute.manager [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Instance network_info: |[{"id": "4c2b4113-e4a7-40bf-84ee-fc649f767bcb", "address": "fa:16:3e:99:27:58", "network": {"id": "aeaa87c8-704e-479e-9bca-e70f676fcf32", "bridge": "br-int", "label": "tempest-ServersTestJSON-644115068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2b7f4b97c0ca4859964e6ea23310e9ce", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7d689fd7-f53e-4fd3-80d9-8d6b8fb7a164", "external-id": "nsx-vlan-transportzone-972", "segmentation_id": 972, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4c2b4113-e4", "ovs_interfaceid": "4c2b4113-e4a7-40bf-84ee-fc649f767bcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 2042.595821] env[67964]: DEBUG 
nova.virt.vmwareapi.vmops [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:99:27:58', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7d689fd7-f53e-4fd3-80d9-8d6b8fb7a164', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4c2b4113-e4a7-40bf-84ee-fc649f767bcb', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2042.603188] env[67964]: DEBUG oslo.service.loopingcall [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2042.603617] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2042.603842] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0f1cfd35-1512-444f-a6e2-26084b25f8f4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2042.628445] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2042.628445] env[67964]: value = "task-3456886" [ 2042.628445] env[67964]: _type = "Task" [ 2042.628445] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2042.636159] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456886, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2043.139073] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456886, 'name': CreateVM_Task, 'duration_secs': 0.276726} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2043.139463] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2043.139911] env[67964]: DEBUG oslo_concurrency.lockutils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2043.140091] env[67964]: DEBUG oslo_concurrency.lockutils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2043.140404] env[67964]: DEBUG oslo_concurrency.lockutils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2043.140651] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f31247f2-1a5b-4792-a2fa-023b519fefea {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2043.144890] env[67964]: DEBUG oslo_vmware.api [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Waiting for the task: (returnval){ [ 2043.144890] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]527a421e-bb6d-dbfb-39ad-34da8bd10278" [ 2043.144890] env[67964]: _type = "Task" [ 2043.144890] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2043.151893] env[67964]: DEBUG oslo_vmware.api [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]527a421e-bb6d-dbfb-39ad-34da8bd10278, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2043.656268] env[67964]: DEBUG oslo_concurrency.lockutils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2043.656524] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2043.656733] env[67964]: DEBUG oslo_concurrency.lockutils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2044.084533] env[67964]: DEBUG nova.compute.manager [req-d6ce5397-b18a-4831-bc99-d7847d0674eb req-698d0336-128d-4046-9ab8-a18fcc545648 service nova] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Received event network-changed-4c2b4113-e4a7-40bf-84ee-fc649f767bcb {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 2044.084717] env[67964]: DEBUG nova.compute.manager [req-d6ce5397-b18a-4831-bc99-d7847d0674eb req-698d0336-128d-4046-9ab8-a18fcc545648 service nova] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Refreshing instance network info cache due to event network-changed-4c2b4113-e4a7-40bf-84ee-fc649f767bcb. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2044.084919] env[67964]: DEBUG oslo_concurrency.lockutils [req-d6ce5397-b18a-4831-bc99-d7847d0674eb req-698d0336-128d-4046-9ab8-a18fcc545648 service nova] Acquiring lock "refresh_cache-2a0e1c08-8201-4ed7-9072-fdd90f25f120" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2044.085073] env[67964]: DEBUG oslo_concurrency.lockutils [req-d6ce5397-b18a-4831-bc99-d7847d0674eb req-698d0336-128d-4046-9ab8-a18fcc545648 service nova] Acquired lock "refresh_cache-2a0e1c08-8201-4ed7-9072-fdd90f25f120" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2044.085236] env[67964]: DEBUG nova.network.neutron [req-d6ce5397-b18a-4831-bc99-d7847d0674eb req-698d0336-128d-4046-9ab8-a18fcc545648 service nova] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Refreshing network info cache for port 4c2b4113-e4a7-40bf-84ee-fc649f767bcb {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2044.306436] env[67964]: DEBUG nova.network.neutron [req-d6ce5397-b18a-4831-bc99-d7847d0674eb req-698d0336-128d-4046-9ab8-a18fcc545648 service nova] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Updated VIF entry in instance network info cache for port 4c2b4113-e4a7-40bf-84ee-fc649f767bcb. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2044.306781] env[67964]: DEBUG nova.network.neutron [req-d6ce5397-b18a-4831-bc99-d7847d0674eb req-698d0336-128d-4046-9ab8-a18fcc545648 service nova] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Updating instance_info_cache with network_info: [{"id": "4c2b4113-e4a7-40bf-84ee-fc649f767bcb", "address": "fa:16:3e:99:27:58", "network": {"id": "aeaa87c8-704e-479e-9bca-e70f676fcf32", "bridge": "br-int", "label": "tempest-ServersTestJSON-644115068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2b7f4b97c0ca4859964e6ea23310e9ce", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7d689fd7-f53e-4fd3-80d9-8d6b8fb7a164", "external-id": "nsx-vlan-transportzone-972", "segmentation_id": 972, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4c2b4113-e4", "ovs_interfaceid": "4c2b4113-e4a7-40bf-84ee-fc649f767bcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2044.315713] env[67964]: DEBUG oslo_concurrency.lockutils [req-d6ce5397-b18a-4831-bc99-d7847d0674eb req-698d0336-128d-4046-9ab8-a18fcc545648 service nova] Releasing lock "refresh_cache-2a0e1c08-8201-4ed7-9072-fdd90f25f120" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2047.800758] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2047.812412] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2047.812639] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2047.812787] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2047.812940] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2047.814094] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd34e7f5-8864-4bf6-bcb1-af17181b122b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.822725] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a9d9ab3-8969-4c12-85bc-0261f06f4c11 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.836206] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c618980-b4c9-41e6-9283-91dc8642acad {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.838933] env[67964]: WARNING oslo_vmware.rw_handles [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2047.838933] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2047.838933] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2047.838933] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2047.838933] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2047.838933] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 2047.838933] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2047.838933] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2047.838933] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2047.838933] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2047.838933] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2047.838933] env[67964]: ERROR oslo_vmware.rw_handles [ 2047.839372] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/8d247521-6d94-4d2d-94ed-0871848da372/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2047.841337] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2047.841573] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Copying Virtual Disk [datastore1] 
vmware_temp/8d247521-6d94-4d2d-94ed-0871848da372/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/8d247521-6d94-4d2d-94ed-0871848da372/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2047.841793] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e4fbf340-ed82-4873-b69b-c24c148be0e5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.849318] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a5c9f15-53b7-4090-8735-df3264c260cc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.853490] env[67964]: DEBUG oslo_vmware.api [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for the task: (returnval){ [ 2047.853490] env[67964]: value = "task-3456887" [ 2047.853490] env[67964]: _type = "Task" [ 2047.853490] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2047.881152] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180902MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2047.881358] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2047.881697] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2047.888527] env[67964]: DEBUG oslo_vmware.api [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': task-3456887, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2047.953278] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2c06844d-2c7f-4e27-b3c6-16dfd6047119 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2047.953463] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 41d93bf8-7991-4b52-8ebb-a1988dc627c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2047.953633] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 430cad73-6b2c-4702-96a0-672f5b4c219f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2047.953765] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance bc98edf7-889e-4814-b859-d860033ba0cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2047.953887] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c01bc11b-384e-418e-be43-e12d0a845a24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2047.954049] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2047.954186] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 07489f39-f57c-4528-80b8-b42056181b8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2047.954332] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 3e0e0504-9c76-4201-baf8-2d9636981f0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2047.954419] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance aa9c54a7-7b81-45cb-9f53-2016f4ea4b72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2047.954570] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2a0e1c08-8201-4ed7-9072-fdd90f25f120 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2047.954781] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2047.954921] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2048.082828] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68b2bc01-a3e5-4ce7-9feb-d53163a93654 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.090639] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02e37da8-71a4-4a9f-8e65-531a6ca097af {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.121142] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f4fed72-0ff1-4c39-bdce-f19bf164af25 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.127830] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bfd4bd7-5c74-4ad4-8d89-2aeea880331f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.141013] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2048.149972] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2048.165203] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2048.165387] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.284s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2048.364247] env[67964]: DEBUG oslo_vmware.exceptions [None 
req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2048.364571] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2048.365142] env[67964]: ERROR nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2048.365142] env[67964]: Faults: ['InvalidArgument'] [ 2048.365142] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Traceback (most recent call last): [ 2048.365142] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2048.365142] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] yield resources [ 2048.365142] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2048.365142] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] self.driver.spawn(context, instance, image_meta, [ 2048.365142] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2048.365142] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2048.365142] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2048.365142] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] self._fetch_image_if_missing(context, vi) [ 2048.365142] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2048.365527] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] image_cache(vi, tmp_image_ds_loc) [ 2048.365527] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2048.365527] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] vm_util.copy_virtual_disk( [ 2048.365527] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2048.365527] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] 
session._wait_for_task(vmdk_copy_task) [ 2048.365527] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2048.365527] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] return self.wait_for_task(task_ref) [ 2048.365527] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2048.365527] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] return evt.wait() [ 2048.365527] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2048.365527] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] result = hub.switch() [ 2048.365527] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2048.365527] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] return self.greenlet.switch() [ 2048.365928] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2048.365928] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] self.f(*self.args, **self.kw) [ 2048.365928] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2048.365928] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] raise exceptions.translate_fault(task_info.error) [ 2048.365928] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2048.365928] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Faults: ['InvalidArgument'] [ 2048.365928] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] [ 2048.365928] env[67964]: INFO nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Terminating instance [ 2048.367013] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2048.367246] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2048.367498] env[67964]: DEBUG 
oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-514f2018-ddf1-48ff-b56d-e21155030696 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.369877] env[67964]: DEBUG nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2048.370080] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2048.370804] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4916fae5-0c8a-4215-99ab-6c8ef1e943bf {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.377610] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2048.377828] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a36929d5-1070-458a-88ec-b1b58a5a1ac1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.380121] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2048.380297] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2048.381253] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e3bd7a6d-81f0-4a55-9917-5f0afbbf439b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.386453] env[67964]: DEBUG oslo_vmware.api [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for the task: (returnval){ [ 2048.386453] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52c7a9b8-0a79-f045-f39e-ea6cf472b918" [ 2048.386453] env[67964]: _type = "Task" [ 2048.386453] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2048.400629] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2048.400854] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Creating directory with path [datastore1] vmware_temp/6504c042-bd22-40cd-9039-eb6b37853826/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2048.401095] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c98da8cf-432b-482c-b0bb-cfba96923bbb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.421283] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Created directory with path [datastore1] vmware_temp/6504c042-bd22-40cd-9039-eb6b37853826/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2048.421490] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Fetch image to [datastore1] vmware_temp/6504c042-bd22-40cd-9039-eb6b37853826/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2048.421657] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/6504c042-bd22-40cd-9039-eb6b37853826/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2048.422485] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48a8ccc5-b5b8-4279-b43a-cdf34b664ab8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.429873] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-281ed2a7-4c34-4cbd-9ad4-16567110c4dc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.440386] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a34f0ce-a8dd-4784-9bfb-8fa2442291b7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.445265] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 
2c06844d-2c7f-4e27-b3c6-16dfd6047119] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2048.445470] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2048.445642] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Deleting the datastore file [datastore1] 2c06844d-2c7f-4e27-b3c6-16dfd6047119 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2048.446259] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f8852865-a42c-4352-ac8b-610419b82bdf {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.476386] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1031f10d-c507-4410-b763-9234abd864ea {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.479213] env[67964]: DEBUG oslo_vmware.api [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for the task: (returnval){ [ 2048.479213] env[67964]: value = "task-3456889" [ 2048.479213] env[67964]: _type = "Task" [ 2048.479213] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2048.484570] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6ea83a26-f06d-45b2-a87b-e381daef299f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.488962] env[67964]: DEBUG oslo_vmware.api [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': task-3456889, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2048.508925] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2048.558936] env[67964]: DEBUG oslo_vmware.rw_handles [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6504c042-bd22-40cd-9039-eb6b37853826/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2048.617443] env[67964]: DEBUG oslo_vmware.rw_handles [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2048.617626] env[67964]: DEBUG oslo_vmware.rw_handles [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6504c042-bd22-40cd-9039-eb6b37853826/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2048.989972] env[67964]: DEBUG oslo_vmware.api [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': task-3456889, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080345} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2048.990289] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2048.990333] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2048.990479] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2048.990657] env[67964]: INFO nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Took 0.62 seconds to destroy the instance on the hypervisor. 
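[Editor's note] The cleanup above closes out the first failed spawn in this section: CopyVirtualDisk_Task (task-3456887) errored server-side with Faults: ['InvalidArgument'] ("A specified parameter was not correct: fileType"), and nova only saw the failure when oslo.vmware's task poller translated the task error into a VimFaultException, as the traceback shows. A minimal sketch of that polling path follows; the TaskInfo dict shape and helper names are illustrative assumptions, and only the success/error state machine mirrors what the log records.

```python
# Minimal sketch of the client-side task polling named in the traceback
# (oslo_vmware.api wait_for_task -> _poll_task). Illustrative, not the
# actual oslo.vmware implementation.
import time

class VimFaultException(Exception):
    """Carries the fault list seen above, e.g. ['InvalidArgument']."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def poll_task(get_task_info, interval=0.5):
    while True:
        info = get_task_info()  # one PropertyCollector round-trip per poll
        if info['state'] == 'success':
            return info.get('result')
        if info['state'] == 'error':
            # "A specified parameter was not correct: fileType" surfaces
            # here as a client-side exception for the compute manager.
            raise VimFaultException(info['faults'], info['message'])
        time.sleep(interval)  # 'queued'/'running': keep polling
```

In the log this is visible as the "progress is 0%" _poll_task line followed by "Fault InvalidArgument not matched." once the task errors.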
[ 2048.992871] env[67964]: DEBUG nova.compute.claims [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2048.993052] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2048.993275] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2049.164738] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5af9b41-6640-4d0a-bf95-d184e35c7b4f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.171842] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aea137c9-91db-48f8-b395-89853a0aa899 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.200741] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c52dc74-70ee-4bc6-9afc-42886e3d5567 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.208066] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b87ca990-2ad5-4eb2-bf24-841ef9f32300 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.220251] env[67964]: DEBUG nova.compute.provider_tree [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2049.228433] env[67964]: DEBUG nova.scheduler.client.report [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2049.240884] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 
tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.248s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2049.241435] env[67964]: ERROR nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2049.241435] env[67964]: Faults: ['InvalidArgument'] [ 2049.241435] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Traceback (most recent call last): [ 2049.241435] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2049.241435] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] self.driver.spawn(context, instance, image_meta, [ 2049.241435] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2049.241435] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2049.241435] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2049.241435] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] self._fetch_image_if_missing(context, vi) [ 2049.241435] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2049.241435] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] image_cache(vi, tmp_image_ds_loc) [ 2049.241435] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2049.241756] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] vm_util.copy_virtual_disk( [ 2049.241756] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2049.241756] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] session._wait_for_task(vmdk_copy_task) [ 2049.241756] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2049.241756] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] return self.wait_for_task(task_ref) [ 2049.241756] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2049.241756] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] return evt.wait() [ 2049.241756] 
env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2049.241756] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] result = hub.switch() [ 2049.241756] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2049.241756] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] return self.greenlet.switch() [ 2049.241756] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2049.241756] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] self.f(*self.args, **self.kw) [ 2049.241980] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2049.241980] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] raise exceptions.translate_fault(task_info.error) [ 2049.241980] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2049.241980] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Faults: ['InvalidArgument'] [ 2049.241980] env[67964]: ERROR nova.compute.manager [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] [ 2049.242207] env[67964]: DEBUG nova.compute.utils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2049.243504] env[67964]: DEBUG nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Build of instance 2c06844d-2c7f-4e27-b3c6-16dfd6047119 was re-scheduled: A specified parameter was not correct: fileType [ 2049.243504] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2049.243867] env[67964]: DEBUG nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2049.244077] env[67964]: DEBUG nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2049.244243] env[67964]: DEBUG nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2049.244597] env[67964]: DEBUG nova.network.neutron [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2049.541058] env[67964]: DEBUG nova.network.neutron [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2049.550266] env[67964]: INFO nova.compute.manager [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Took 0.31 seconds to deallocate network for instance. [ 2049.642912] env[67964]: INFO nova.scheduler.client.report [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Deleted allocations for instance 2c06844d-2c7f-4e27-b3c6-16dfd6047119 [ 2049.664388] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43e53763-c08e-4725-bf18-38da20a70084 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 647.561s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2049.664683] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43a90994-2e6c-43e7-a497-9047531cfc1c tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 451.779s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2049.664911] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43a90994-2e6c-43e7-a497-9047531cfc1c tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2049.665133] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43a90994-2e6c-43e7-a497-9047531cfc1c tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2049.665350] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43a90994-2e6c-43e7-a497-9047531cfc1c tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2049.667531] env[67964]: INFO nova.compute.manager [None req-43a90994-2e6c-43e7-a497-9047531cfc1c tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Terminating instance [ 2049.669283] env[67964]: DEBUG nova.compute.manager [None req-43a90994-2e6c-43e7-a497-9047531cfc1c tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2049.669484] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43a90994-2e6c-43e7-a497-9047531cfc1c tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2049.669976] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6a45e523-b13f-4f33-b0fb-d86f07c0b5d4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.679959] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce70cff6-f472-4ae9-8a3b-100c4d44858f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.708343] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-43a90994-2e6c-43e7-a497-9047531cfc1c tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2c06844d-2c7f-4e27-b3c6-16dfd6047119 could not be found. [ 2049.708548] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-43a90994-2e6c-43e7-a497-9047531cfc1c tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2049.708722] env[67964]: INFO nova.compute.manager [None req-43a90994-2e6c-43e7-a497-9047531cfc1c tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2049.708959] env[67964]: DEBUG oslo.service.loopingcall [None req-43a90994-2e6c-43e7-a497-9047531cfc1c tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2049.709208] env[67964]: DEBUG nova.compute.manager [-] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2049.709357] env[67964]: DEBUG nova.network.neutron [-] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2049.747168] env[67964]: DEBUG nova.network.neutron [-] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2049.755241] env[67964]: INFO nova.compute.manager [-] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] Took 0.05 seconds to deallocate network for instance. [ 2049.835176] env[67964]: DEBUG oslo_concurrency.lockutils [None req-43a90994-2e6c-43e7-a497-9047531cfc1c tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.170s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2049.835990] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 276.110s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2049.836192] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2c06844d-2c7f-4e27-b3c6-16dfd6047119] During sync_power_state the instance has a pending task (deleting). Skip. 
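[Editor's note] The lockutils lines quantify how serialized this failure path is: the per-instance build lock on 2c06844d was held 647.561s, the terminate request waited 451.779s for it, and _sync_power_states waited 276.110s before skipping the deleting instance. A minimal example of the same locking pattern — real oslo.concurrency API with placeholder bodies; fair=True on the resource-tracker lock is an assumption about nova's configuration:

```python
# The "Acquiring lock ... / acquired ... waited Ns / released ... held Ns"
# bookkeeping above comes from oslo.concurrency's named locks.
from oslo_concurrency import lockutils

@lockutils.synchronized('2c06844d-2c7f-4e27-b3c6-16dfd6047119')
def do_terminate_instance():
    # Blocked here until _locked_do_build_and_run_instance released the
    # same per-instance lock -- the 451.779s wait recorded above.
    pass

def update_available_resource():
    # The resource tracker serializes all claims on one named lock
    # ("compute_resources" in the log); fair=True keeps FIFO ordering.
    with lockutils.lock('compute_resources', fair=True):
        pass
```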
[ 2049.836367] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "2c06844d-2c7f-4e27-b3c6-16dfd6047119" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2052.801632] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2052.801632] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Cleaning up deleted instances {{(pid=67964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11199}} [ 2052.812598] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] There are 0 instances to clean {{(pid=67964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11208}} [ 2055.812599] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2055.812854] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 2056.796485] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2056.800117] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2057.801076] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2057.801379] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2058.807123] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2058.807457] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 2058.807457] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the 
list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 2058.827134] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2058.827284] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2058.827420] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2058.827545] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2058.827666] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2058.827784] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2058.827901] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2058.828027] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2058.828149] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2058.828266] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 2058.828719] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2058.828906] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2061.799965] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2069.801700] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2069.803227] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Cleaning up deleted instances with incomplete migration {{(pid=67964) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11237}} [ 2097.852286] env[67964]: WARNING oslo_vmware.rw_handles [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2097.852286] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2097.852286] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2097.852286] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2097.852286] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2097.852286] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 2097.852286] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2097.852286] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2097.852286] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2097.852286] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2097.852286] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2097.852286] env[67964]: ERROR oslo_vmware.rw_handles [ 2097.852833] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/6504c042-bd22-40cd-9039-eb6b37853826/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2097.854777] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2097.855071] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Copying Virtual Disk [datastore1] vmware_temp/6504c042-bd22-40cd-9039-eb6b37853826/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/6504c042-bd22-40cd-9039-eb6b37853826/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2097.855353] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-63c41e8c-2b85-42a8-ae05-6169639e4f68 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.863851] env[67964]: DEBUG oslo_vmware.api [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for the task: (returnval){ [ 2097.863851] env[67964]: value = "task-3456890" [ 2097.863851] env[67964]: _type = "Task" [ 2097.863851] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2097.871795] env[67964]: DEBUG oslo_vmware.api [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Task: {'id': task-3456890, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2098.373964] env[67964]: DEBUG oslo_vmware.exceptions [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Fault InvalidArgument not matched. 
{{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2098.374274] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2098.374909] env[67964]: ERROR nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2098.374909] env[67964]: Faults: ['InvalidArgument'] [ 2098.374909] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Traceback (most recent call last): [ 2098.374909] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2098.374909] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] yield resources [ 2098.374909] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2098.374909] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] self.driver.spawn(context, instance, image_meta, [ 2098.374909] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2098.374909] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2098.374909] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2098.374909] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] self._fetch_image_if_missing(context, vi) [ 2098.374909] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2098.374909] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] image_cache(vi, tmp_image_ds_loc) [ 2098.375242] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2098.375242] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] vm_util.copy_virtual_disk( [ 2098.375242] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2098.375242] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] session._wait_for_task(vmdk_copy_task) [ 2098.375242] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task 
[ 2098.375242] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] return self.wait_for_task(task_ref) [ 2098.375242] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2098.375242] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] return evt.wait() [ 2098.375242] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2098.375242] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] result = hub.switch() [ 2098.375242] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2098.375242] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] return self.greenlet.switch() [ 2098.375242] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2098.375517] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] self.f(*self.args, **self.kw) [ 2098.375517] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2098.375517] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] raise exceptions.translate_fault(task_info.error) [ 2098.375517] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2098.375517] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Faults: ['InvalidArgument'] [ 2098.375517] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] [ 2098.375517] env[67964]: INFO nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Terminating instance [ 2098.376740] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2098.376941] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2098.377197] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0382e910-6bd2-4ae3-a8a3-5b1fd0a0fdf1 {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.379323] env[67964]: DEBUG nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2098.379511] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2098.380252] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abbd57d1-12bf-44f2-a3fb-01b3d05b7de0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.387135] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2098.387355] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3f1c11c0-47fd-46dd-a838-4571e4fa12b0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.389412] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2098.389582] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2098.390529] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6fe834d9-6bc2-4590-bdcb-c3f1cc89e57f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.395460] env[67964]: DEBUG oslo_vmware.api [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Waiting for the task: (returnval){ [ 2098.395460] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52ed29b0-0a23-5370-f45e-a8ad80736a8d" [ 2098.395460] env[67964]: _type = "Task" [ 2098.395460] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2098.402456] env[67964]: DEBUG oslo_vmware.api [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52ed29b0-0a23-5370-f45e-a8ad80736a8d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2098.458156] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2098.458391] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2098.458539] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Deleting the datastore file [datastore1] 41d93bf8-7991-4b52-8ebb-a1988dc627c1 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2098.458800] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8f9c5120-5e8b-4c5c-b36e-52a286b51bc5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.464785] env[67964]: DEBUG oslo_vmware.api [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for the task: (returnval){ [ 2098.464785] env[67964]: value = "task-3456892" [ 2098.464785] env[67964]: _type = "Task" [ 2098.464785] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2098.472263] env[67964]: DEBUG oslo_vmware.api [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Task: {'id': task-3456892, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2098.906351] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2098.906712] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Creating directory with path [datastore1] vmware_temp/5b3287c8-31ce-4566-9ee2-e829e24f081a/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2098.906849] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1ee6f6d9-fa96-4286-acd7-0cc06ac7b059 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.918277] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Created directory with path [datastore1] vmware_temp/5b3287c8-31ce-4566-9ee2-e829e24f081a/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2098.918470] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Fetch image to [datastore1] vmware_temp/5b3287c8-31ce-4566-9ee2-e829e24f081a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2098.918633] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/5b3287c8-31ce-4566-9ee2-e829e24f081a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2098.919386] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fac5e42a-b693-4be0-adad-a432af8563bb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.925966] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2abca1b-a21f-41d0-abb1-7e8ec0c61094 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.934786] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cd273d7-6268-4d4f-b570-3687a809da09 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.965502] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b363c9da-a632-485f-9f08-5d9d958c3538 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.976171] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6492fda4-f23c-49de-ba3f-aee693e6683b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.977784] env[67964]: DEBUG oslo_vmware.api [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Task: {'id': task-3456892, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067434} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2098.978026] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2098.978208] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2098.978376] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2098.978541] env[67964]: INFO nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Took 0.60 seconds to destroy the instance on the hypervisor. 
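The spawn failures above all surface through the same oslo.vmware pattern: copy_virtual_disk submits a CopyVirtualDisk_Task to vCenter, _wait_for_task polls it, and a task that finishes in the error state is translated into a VimFaultException carrying the vCenter fault list, here ['InvalidArgument'] for fileType. A minimal sketch of that polling loop, using illustrative names rather than the actual oslo.vmware implementation:

    import time

    class VimFaultException(Exception):
        """Illustrative stand-in for oslo_vmware.exceptions.VimFaultException."""
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    def wait_for_task(get_task_info, interval=0.5):
        # Poll the task the way the "progress is 0%" _poll_task lines suggest:
        # loop until vCenter reports the task as finished, one way or the other.
        while True:
            info = get_task_info()
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # Counterpart of raise exceptions.translate_fault(task_info.error):
                # the fault names (e.g. ['InvalidArgument']) ride on the exception.
                raise VimFaultException(info['faults'], info['message'])
            time.sleep(interval)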
[ 2098.980612] env[67964]: DEBUG nova.compute.claims [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2098.980797] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2098.981034] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2098.998950] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2099.048385] env[67964]: DEBUG oslo_vmware.rw_handles [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5b3287c8-31ce-4566-9ee2-e829e24f081a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2099.108719] env[67964]: DEBUG oslo_vmware.rw_handles [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2099.109016] env[67964]: DEBUG oslo_vmware.rw_handles [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5b3287c8-31ce-4566-9ee2-e829e24f081a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2099.186240] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98f9c706-af8e-4841-bb6d-ed7b41d55a00 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2099.193699] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a300733-a6e0-4cce-93e8-6795692948c1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2099.222964] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96fb69f5-2abd-4496-ac9d-af1634cf6a0b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2099.229823] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9f29f61-fd5d-49ca-85e7-6f394e7bb3ea {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2099.243415] env[67964]: DEBUG nova.compute.provider_tree [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2099.251464] env[67964]: DEBUG nova.scheduler.client.report [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2099.264610] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.284s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2099.265201] env[67964]: ERROR nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2099.265201] env[67964]: Faults: ['InvalidArgument'] [ 2099.265201] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Traceback (most recent call last): [ 2099.265201] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2099.265201] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] 
self.driver.spawn(context, instance, image_meta, [ 2099.265201] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2099.265201] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2099.265201] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2099.265201] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] self._fetch_image_if_missing(context, vi) [ 2099.265201] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2099.265201] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] image_cache(vi, tmp_image_ds_loc) [ 2099.265201] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2099.265540] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] vm_util.copy_virtual_disk( [ 2099.265540] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2099.265540] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] session._wait_for_task(vmdk_copy_task) [ 2099.265540] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2099.265540] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] return self.wait_for_task(task_ref) [ 2099.265540] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2099.265540] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] return evt.wait() [ 2099.265540] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2099.265540] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] result = hub.switch() [ 2099.265540] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2099.265540] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] return self.greenlet.switch() [ 2099.265540] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2099.265540] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] self.f(*self.args, **self.kw) [ 2099.265800] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 2099.265800] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] raise exceptions.translate_fault(task_info.error) [ 2099.265800] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2099.265800] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Faults: ['InvalidArgument'] [ 2099.265800] env[67964]: ERROR nova.compute.manager [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] [ 2099.265959] env[67964]: DEBUG nova.compute.utils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2099.267234] env[67964]: DEBUG nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Build of instance 41d93bf8-7991-4b52-8ebb-a1988dc627c1 was re-scheduled: A specified parameter was not correct: fileType [ 2099.267234] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2099.267604] env[67964]: DEBUG nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2099.267773] env[67964]: DEBUG nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2099.267940] env[67964]: DEBUG nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2099.268111] env[67964]: DEBUG nova.network.neutron [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2099.533877] env[67964]: DEBUG nova.network.neutron [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2099.547840] env[67964]: INFO nova.compute.manager [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Took 0.28 seconds to deallocate network for instance. 
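The "compute_resources" lock lines bracketing the claim abort (acquired at 2098.981034, released at 2099.264610 after 0.284s) are emitted by oslo.concurrency's lockutils, which records how long a caller waited for and then held a named semaphore. A small usage sketch follows; the decorated function and its body are hypothetical, only the decorator is the real lockutils API:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(tracker, instance_uuid):
        # Hypothetical bookkeeping: return the instance's claimed CPU/RAM/disk
        # to the pool; the real ResourceTracker also updates placement records.
        tracker.claims.pop(instance_uuid, None)

Every call site decorated with the same lock name serializes on the same semaphore, which is why the periodic resource audit later in the log also waits on "compute_resources".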
[ 2099.633761] env[67964]: INFO nova.scheduler.client.report [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Deleted allocations for instance 41d93bf8-7991-4b52-8ebb-a1988dc627c1 [ 2099.660867] env[67964]: DEBUG oslo_concurrency.lockutils [None req-fd38a148-0574-4806-a3da-b2035d518cf8 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "41d93bf8-7991-4b52-8ebb-a1988dc627c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 681.867s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2099.660867] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e476c577-57da-4efd-b6df-0dbb80ab5e47 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "41d93bf8-7991-4b52-8ebb-a1988dc627c1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 486.508s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2099.660867] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e476c577-57da-4efd-b6df-0dbb80ab5e47 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "41d93bf8-7991-4b52-8ebb-a1988dc627c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2099.660867] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e476c577-57da-4efd-b6df-0dbb80ab5e47 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "41d93bf8-7991-4b52-8ebb-a1988dc627c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2099.661160] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e476c577-57da-4efd-b6df-0dbb80ab5e47 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "41d93bf8-7991-4b52-8ebb-a1988dc627c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2099.661895] env[67964]: INFO nova.compute.manager [None req-e476c577-57da-4efd-b6df-0dbb80ab5e47 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Terminating instance [ 2099.663834] env[67964]: DEBUG nova.compute.manager [None req-e476c577-57da-4efd-b6df-0dbb80ab5e47 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Start destroying the instance on the hypervisor.
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2099.664192] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e476c577-57da-4efd-b6df-0dbb80ab5e47 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2099.664764] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9fc72855-4709-44e8-b4c7-4b2c70e00adb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2099.675997] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ad424e9-4bfa-4dac-bf32-13b9b83166fb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2099.705483] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-e476c577-57da-4efd-b6df-0dbb80ab5e47 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 41d93bf8-7991-4b52-8ebb-a1988dc627c1 could not be found. [ 2099.705667] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e476c577-57da-4efd-b6df-0dbb80ab5e47 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2099.705853] env[67964]: INFO nova.compute.manager [None req-e476c577-57da-4efd-b6df-0dbb80ab5e47 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2099.706116] env[67964]: DEBUG oslo.service.loopingcall [None req-e476c577-57da-4efd-b6df-0dbb80ab5e47 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2099.706342] env[67964]: DEBUG nova.compute.manager [-] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2099.706437] env[67964]: DEBUG nova.network.neutron [-] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2099.727937] env[67964]: DEBUG nova.network.neutron [-] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2099.735691] env[67964]: INFO nova.compute.manager [-] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] Took 0.03 seconds to deallocate network for instance.
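The inventory payload reported for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 (above, and again in the audit that follows) is what placement uses for capacity checks: per resource class, usable capacity is (total - reserved) * allocation_ratio, and max_unit caps any single allocation. A quick sketch of that arithmetic against the logged numbers:

    def effective_capacity(inventory):
        # Placement-style usable capacity per resource class.
        return {rc: (inv['total'] - inv['reserved']) * inv['allocation_ratio']
                for rc, inv in inventory.items()}

    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }
    print(effective_capacity(inventory))
    # {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}

With the eight active instances each holding {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, the audit's "total allocated vcpus: 8", used_ram=1536MB (512 reserved plus 8 x 128) and used_disk=8GB figures are consistent with those allocations.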
[ 2099.814607] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e476c577-57da-4efd-b6df-0dbb80ab5e47 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "41d93bf8-7991-4b52-8ebb-a1988dc627c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.155s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2099.815443] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "41d93bf8-7991-4b52-8ebb-a1988dc627c1" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 326.089s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2099.815628] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 41d93bf8-7991-4b52-8ebb-a1988dc627c1] During sync_power_state the instance has a pending task (deleting). Skip. [ 2099.815830] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "41d93bf8-7991-4b52-8ebb-a1988dc627c1" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2108.811062] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2108.823825] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2108.824070] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2108.824244] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2108.824399] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2108.825540] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cdcdc3f-4786-4c8a-bf64-07dbfe5d908c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2108.835114] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with
opID=oslo.vmware-665db838-4fc5-477e-a794-4f530a464937 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2108.848664] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaf8e7d8-9e95-4db3-9255-3a069be5c253 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2108.855158] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6c8d2f1-6c7b-4279-b750-beae4cb9ac1c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2108.883011] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180841MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2108.883173] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2108.883367] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2108.972122] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 430cad73-6b2c-4702-96a0-672f5b4c219f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2108.972290] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance bc98edf7-889e-4814-b859-d860033ba0cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2108.972421] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c01bc11b-384e-418e-be43-e12d0a845a24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2108.972544] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2108.972662] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 07489f39-f57c-4528-80b8-b42056181b8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2108.972778] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 3e0e0504-9c76-4201-baf8-2d9636981f0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2108.972892] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance aa9c54a7-7b81-45cb-9f53-2016f4ea4b72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2108.973018] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2a0e1c08-8201-4ed7-9072-fdd90f25f120 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2108.973218] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2108.973355] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2108.990760] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing inventories for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:818}} [ 2109.003273] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Updating ProviderTree inventory for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:782}} [ 2109.003447] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Updating inventory in ProviderTree for provider 
2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2109.013249] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing aggregate associations for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41, aggregates: None {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:827}} [ 2109.029878] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing trait associations for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:839}} [ 2109.119648] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddded0c9-55fa-4ddc-b943-ad0f7add6723 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2109.127284] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2de09290-c5f5-482e-8138-65ce926a40da {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2109.156845] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6425c78-a071-41b3-a3c7-2e4eb683b5b3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2109.163705] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7c9a8e8-2c81-4233-8bbd-b99487a1a7f9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2109.176373] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2109.184390] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2109.197295] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2109.197490] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.314s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2117.191255] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2117.795984] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2117.799837] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2117.799837] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2117.799837] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 2119.801673] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2119.801986] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2120.802046] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2120.802046] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 2120.802046] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 2120.819502] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2120.819672] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2120.819779] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2120.819905] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2120.820036] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2120.820162] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2120.820286] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2120.820402] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2120.820519] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 2121.800300] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2134.796027] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2144.762096] env[67964]: WARNING oslo_vmware.rw_handles [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2144.762096] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2144.762096] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2144.762096] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2144.762096] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2144.762096] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 2144.762096] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2144.762096] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2144.762096] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2144.762096] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2144.762096] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2144.762096] env[67964]: ERROR oslo_vmware.rw_handles [ 2144.762096] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/5b3287c8-31ce-4566-9ee2-e829e24f081a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2144.764510] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2144.764800] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Copying Virtual Disk [datastore1] vmware_temp/5b3287c8-31ce-4566-9ee2-e829e24f081a/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] 
vmware_temp/5b3287c8-31ce-4566-9ee2-e829e24f081a/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2144.765111] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-828fd6c3-9759-49d5-8ced-1c068534218a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2144.772879] env[67964]: DEBUG oslo_vmware.api [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Waiting for the task: (returnval){ [ 2144.772879] env[67964]: value = "task-3456893" [ 2144.772879] env[67964]: _type = "Task" [ 2144.772879] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2144.781454] env[67964]: DEBUG oslo_vmware.api [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Task: {'id': task-3456893, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2145.283416] env[67964]: DEBUG oslo_vmware.exceptions [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2145.283718] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2145.284287] env[67964]: ERROR nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2145.284287] env[67964]: Faults: ['InvalidArgument'] [ 2145.284287] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Traceback (most recent call last): [ 2145.284287] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2145.284287] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] yield resources [ 2145.284287] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2145.284287] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] self.driver.spawn(context, instance, image_meta, [ 2145.284287] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2145.284287] env[67964]: ERROR 
nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2145.284287] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2145.284287] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] self._fetch_image_if_missing(context, vi) [ 2145.284287] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2145.284661] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] image_cache(vi, tmp_image_ds_loc) [ 2145.284661] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2145.284661] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] vm_util.copy_virtual_disk( [ 2145.284661] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2145.284661] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] session._wait_for_task(vmdk_copy_task) [ 2145.284661] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2145.284661] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] return self.wait_for_task(task_ref) [ 2145.284661] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2145.284661] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] return evt.wait() [ 2145.284661] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2145.284661] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] result = hub.switch() [ 2145.284661] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2145.284661] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] return self.greenlet.switch() [ 2145.285036] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2145.285036] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] self.f(*self.args, **self.kw) [ 2145.285036] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2145.285036] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] raise exceptions.translate_fault(task_info.error) [ 2145.285036] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2145.285036] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Faults: ['InvalidArgument'] [ 2145.285036] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] [ 2145.285036] env[67964]: INFO nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Terminating instance [ 2145.286222] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2145.286433] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2145.287048] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2ef94636-bdaf-4b1b-886c-681ee512e3f0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.289126] env[67964]: DEBUG nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2145.289388] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2145.290108] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50cb470d-2573-459c-b0ff-e782995605c9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.296936] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2145.297148] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c19fe1dd-d504-41d1-a6a5-3b92500de880 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.299235] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2145.299613] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2145.300410] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-90765866-6e22-428f-97c0-be8d537a94b3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.305222] env[67964]: DEBUG oslo_vmware.api [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Waiting for the task: (returnval){ [ 2145.305222] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]527dcf42-0c7a-034a-e084-7ce55076daa2" [ 2145.305222] env[67964]: _type = "Task" [ 2145.305222] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2145.312217] env[67964]: DEBUG oslo_vmware.api [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]527dcf42-0c7a-034a-e084-7ce55076daa2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2145.372499] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2145.373010] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2145.373317] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Deleting the datastore file [datastore1] 430cad73-6b2c-4702-96a0-672f5b4c219f {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2145.373684] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-61b6bf0e-46cd-45e8-92dc-bfde72a388dc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.380248] env[67964]: DEBUG oslo_vmware.api [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Waiting for the task: (returnval){ [ 2145.380248] env[67964]: value = "task-3456895" [ 2145.380248] env[67964]: _type = "Task" [ 2145.380248] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2145.390901] env[67964]: DEBUG oslo_vmware.api [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Task: {'id': task-3456895, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2145.816047] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2145.816406] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Creating directory with path [datastore1] vmware_temp/c365af80-de6d-488c-a915-7ae0fea6a779/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2145.816543] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e103f829-62a8-40f1-9a37-dc73c6191d48 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.827176] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Created directory with path [datastore1] vmware_temp/c365af80-de6d-488c-a915-7ae0fea6a779/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2145.827345] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Fetch image to [datastore1] vmware_temp/c365af80-de6d-488c-a915-7ae0fea6a779/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2145.827509] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/c365af80-de6d-488c-a915-7ae0fea6a779/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2145.828213] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1a4415e-3ff9-424c-b25f-cf26c20432f7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.834630] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7f7fdb9-703e-40f9-961c-7602ac27f86f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.843126] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-479b0e31-168e-44fd-9f14-46db41f7433b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.872760] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-a2395114-412f-4127-a4c8-a4c895553d42 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.877909] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-95055177-d7d0-4425-9c7f-3ba07a0603e4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.888838] env[67964]: DEBUG oslo_vmware.api [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Task: {'id': task-3456895, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063923} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2145.889065] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2145.889244] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2145.889412] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2145.889645] env[67964]: INFO nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Took 0.60 seconds to destroy the instance on the hypervisor. 
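[editor's note] The CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above all follow the same oslo.vmware pattern: the driver invokes a vCenter task, then wait_for_task() polls the task's info until it reports success (the "completed successfully" line) or an error (the VimFaultException raised earlier). A minimal sketch of that polling loop, assuming a simplified poll_task_info() callable rather than oslo.vmware's real session plumbing; the names below are illustrative stand-ins, not the library's API:

    import time

    TASK_SUCCESS = "success"   # simplified stand-ins for the vSphere
    TASK_ERROR = "error"       # TaskInfo states that get polled

    class TaskFailed(Exception):
        """Stand-in for the translated fault oslo.vmware would raise."""

    def wait_for_task(poll_task_info, interval=0.5):
        # poll_task_info() returns a dict such as
        # {'state': 'running', 'progress': 0, 'error': None}
        while True:
            info = poll_task_info()
            if info["state"] == TASK_SUCCESS:
                return info  # the "completed successfully" case above
            if info["state"] == TASK_ERROR:
                raise TaskFailed(info["error"])
            # corresponds to the repeated "progress is 0%." DEBUG lines
            time.sleep(interval)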
[ 2145.891802] env[67964]: DEBUG nova.compute.claims [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2145.892087] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2145.892387] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2145.901100] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2145.949705] env[67964]: DEBUG oslo_vmware.rw_handles [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c365af80-de6d-488c-a915-7ae0fea6a779/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2146.010775] env[67964]: DEBUG oslo_vmware.rw_handles [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2146.011065] env[67964]: DEBUG oslo_vmware.rw_handles [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c365af80-de6d-488c-a915-7ae0fea6a779/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2146.088429] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df82e9d4-9ab0-4f4f-9f1f-d9bbe7b42e4b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.096056] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-206f76e9-e9c1-4ccd-a657-97129b887f97 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.126745] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-347f8096-9661-4833-803c-75c620927032 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.133568] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3a793bd-e66f-46cc-922f-bfd5ee3913e2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.146428] env[67964]: DEBUG nova.compute.provider_tree [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2146.155111] env[67964]: DEBUG nova.scheduler.client.report [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2146.168292] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.276s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2146.168799] env[67964]: ERROR nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2146.168799] env[67964]: Faults: ['InvalidArgument'] [ 2146.168799] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Traceback (most recent call last): [ 2146.168799] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2146.168799] 
env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] self.driver.spawn(context, instance, image_meta, [ 2146.168799] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2146.168799] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2146.168799] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2146.168799] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] self._fetch_image_if_missing(context, vi) [ 2146.168799] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2146.168799] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] image_cache(vi, tmp_image_ds_loc) [ 2146.168799] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2146.169205] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] vm_util.copy_virtual_disk( [ 2146.169205] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2146.169205] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] session._wait_for_task(vmdk_copy_task) [ 2146.169205] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2146.169205] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] return self.wait_for_task(task_ref) [ 2146.169205] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2146.169205] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] return evt.wait() [ 2146.169205] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2146.169205] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] result = hub.switch() [ 2146.169205] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2146.169205] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] return self.greenlet.switch() [ 2146.169205] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2146.169205] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] self.f(*self.args, **self.kw) [ 2146.169704] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2146.169704] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] raise exceptions.translate_fault(task_info.error) [ 2146.169704] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2146.169704] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Faults: ['InvalidArgument'] [ 2146.169704] env[67964]: ERROR nova.compute.manager [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] [ 2146.169704] env[67964]: DEBUG nova.compute.utils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2146.170772] env[67964]: DEBUG nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Build of instance 430cad73-6b2c-4702-96a0-672f5b4c219f was re-scheduled: A specified parameter was not correct: fileType [ 2146.170772] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2146.171175] env[67964]: DEBUG nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2146.171345] env[67964]: DEBUG nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2146.171512] env[67964]: DEBUG nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2146.171672] env[67964]: DEBUG nova.network.neutron [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2146.487026] env[67964]: DEBUG nova.network.neutron [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2146.503555] env[67964]: INFO nova.compute.manager [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Took 0.33 seconds to deallocate network for instance. [ 2146.595608] env[67964]: INFO nova.scheduler.client.report [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Deleted allocations for instance 430cad73-6b2c-4702-96a0-672f5b4c219f [ 2146.620027] env[67964]: DEBUG oslo_concurrency.lockutils [None req-18f6b0eb-d3d7-4196-b8ec-3b328f19afeb tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "430cad73-6b2c-4702-96a0-672f5b4c219f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 556.824s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2146.620159] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "430cad73-6b2c-4702-96a0-672f5b4c219f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 372.893s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2146.620342] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] During sync_power_state the instance has a pending task (spawning). Skip.
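[editor's note] The earlier "Fault InvalidArgument not matched" DEBUG line explains where the generic VimFaultException comes from: oslo.vmware keeps a registry of specific exception classes keyed by vSphere fault name, and falls back to the generic class when a fault (here InvalidArgument) has no dedicated match. A hedged sketch of the idea; the registry contents and class names below are illustrative, not the library's actual table:

    class VimFaultException(Exception):
        """Generic fallback carrying the raw fault names, as in the log."""
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    class FileNotFoundException(VimFaultException):
        pass

    # Illustrative registry; oslo.vmware maps many more fault names.
    _FAULT_TO_EXC = {"FileNotFound": FileNotFoundException}

    def translate_fault(fault_name, message):
        exc_class = _FAULT_TO_EXC.get(fault_name)
        if exc_class is None:
            # The "Fault InvalidArgument not matched" case: fall back to
            # the generic exception with Faults: [fault_name] attached.
            exc_class = VimFaultException
        return exc_class([fault_name], message)

    # e.g. raise translate_fault(
    #     "InvalidArgument",
    #     "A specified parameter was not correct: fileType")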
[ 2146.620510] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "430cad73-6b2c-4702-96a0-672f5b4c219f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2146.620721] env[67964]: DEBUG oslo_concurrency.lockutils [None req-880355fb-a06f-4a50-a83b-b20b6ada56d5 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "430cad73-6b2c-4702-96a0-672f5b4c219f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 361.187s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2146.620929] env[67964]: DEBUG oslo_concurrency.lockutils [None req-880355fb-a06f-4a50-a83b-b20b6ada56d5 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Acquiring lock "430cad73-6b2c-4702-96a0-672f5b4c219f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2146.621156] env[67964]: DEBUG oslo_concurrency.lockutils [None req-880355fb-a06f-4a50-a83b-b20b6ada56d5 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "430cad73-6b2c-4702-96a0-672f5b4c219f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2146.621316] env[67964]: DEBUG oslo_concurrency.lockutils [None req-880355fb-a06f-4a50-a83b-b20b6ada56d5 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "430cad73-6b2c-4702-96a0-672f5b4c219f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2146.623450] env[67964]: INFO nova.compute.manager [None req-880355fb-a06f-4a50-a83b-b20b6ada56d5 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Terminating instance [ 2146.624918] env[67964]: DEBUG nova.compute.manager [None req-880355fb-a06f-4a50-a83b-b20b6ada56d5 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Start destroying the instance on the hypervisor.
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2146.625125] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-880355fb-a06f-4a50-a83b-b20b6ada56d5 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2146.625632] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c32feaaf-bcf6-47f0-aa14-1c0f8b0b4f21 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.634808] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4961d918-756f-48e9-98ac-a51627394abb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.663120] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-880355fb-a06f-4a50-a83b-b20b6ada56d5 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 430cad73-6b2c-4702-96a0-672f5b4c219f could not be found. [ 2146.663279] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-880355fb-a06f-4a50-a83b-b20b6ada56d5 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2146.663467] env[67964]: INFO nova.compute.manager [None req-880355fb-a06f-4a50-a83b-b20b6ada56d5 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2146.663711] env[67964]: DEBUG oslo.service.loopingcall [None req-880355fb-a06f-4a50-a83b-b20b6ada56d5 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2146.663930] env[67964]: DEBUG nova.compute.manager [-] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2146.664037] env[67964]: DEBUG nova.network.neutron [-] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2146.686463] env[67964]: DEBUG nova.network.neutron [-] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2146.693933] env[67964]: INFO nova.compute.manager [-] [instance: 430cad73-6b2c-4702-96a0-672f5b4c219f] Took 0.03 seconds to deallocate network for instance.
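[editor's note] Note how this second terminate (req-880355fb) copes with the cleanup already done by the failed-spawn path: FindAllByUuid comes back empty, vmops logs "Instance does not exist on backend", and the destroy still completes as a success. The shape of that tolerance, as a sketch with simplified stand-in types (driver.destroy_vm is hypothetical, not the nova driver API):

    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""

    def destroy(driver, instance_uuid, log):
        """Destroy a VM, treating an already-missing backend VM as done."""
        try:
            driver.destroy_vm(instance_uuid)
        except InstanceNotFound:
            # Mirrors the WARNING above: the VM was unregistered and its
            # datastore directory deleted during the earlier cleanup, so
            # a second delete is idempotent rather than an error.
            log("Instance does not exist on backend: %s" % instance_uuid)
        log("Instance destroyed: %s" % instance_uuid)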
[ 2146.774607] env[67964]: DEBUG oslo_concurrency.lockutils [None req-880355fb-a06f-4a50-a83b-b20b6ada56d5 tempest-AttachInterfacesTestJSON-599720939 tempest-AttachInterfacesTestJSON-599720939-project-member] Lock "430cad73-6b2c-4702-96a0-672f5b4c219f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.154s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2169.800271] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2169.810688] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2169.810901] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2169.811079] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2169.811237] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2169.812311] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7df1739e-e040-4bd1-b0bd-d2cda1c4ddfa {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2169.821274] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31fabce1-7ea2-4093-b43b-338ad76c95f2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2169.834705] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4493ad1a-4072-4b86-858c-5295b37479aa {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2169.840632] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-719d99eb-69f6-4e38-8eec-2d7429a86ff2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2169.868095] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180873MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view
/opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2169.868226] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2169.868402] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2169.927026] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance bc98edf7-889e-4814-b859-d860033ba0cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2169.927026] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c01bc11b-384e-418e-be43-e12d0a845a24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2169.927026] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2169.927026] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 07489f39-f57c-4528-80b8-b42056181b8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2169.927208] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 3e0e0504-9c76-4201-baf8-2d9636981f0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2169.927208] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance aa9c54a7-7b81-45cb-9f53-2016f4ea4b72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2169.927346] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2a0e1c08-8201-4ed7-9072-fdd90f25f120 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2169.927488] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2169.927626] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2170.011528] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ffb0119-b65c-4c4c-87fa-f7338e42d8c8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2170.018815] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c5b096c-7413-4c8c-8c44-af6451a2bdfc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2170.047580] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3f62055-91b9-4c59-ab8e-d938382396ef {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2170.054194] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d1d7dba-f689-47c6-acf0-062b5c5a4c25 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2170.066738] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2170.075563] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2170.088732] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2170.088898] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.220s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2177.089898] env[67964]: DEBUG oslo_service.periodic_task [None 
req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2178.800780] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2179.796508] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2179.800090] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2179.800281] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2179.800427] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 2180.802155] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2181.800927] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2181.801191] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 2181.801239] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 2181.817370] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2181.817753] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2181.817753] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2181.817753] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2181.817871] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2181.817989] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2181.818125] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2181.818245] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 2182.799986] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2184.194144] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "1327b0f7-bc48-4475-8e12-7dcb7bcf28b5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2184.194465] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "1327b0f7-bc48-4475-8e12-7dcb7bcf28b5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2184.204725] env[67964]: DEBUG nova.compute.manager [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Starting instance...
{{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2184.266245] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2184.266497] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2184.267886] env[67964]: INFO nova.compute.claims [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2184.403800] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e0e6bdc-3870-4ea5-8e4e-56f2943377bf {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2184.411577] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dddf2c0-003b-4108-bd6d-5852bc53dd57 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2184.440120] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90222acc-b68e-4a4d-b039-92bee9dbf635 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2184.447069] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e0c67a8-5663-4cb2-af72-6b4f0e1207dd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2184.459718] env[67964]: DEBUG nova.compute.provider_tree [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2184.488246] env[67964]: DEBUG nova.scheduler.client.report [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2184.500859] env[67964]: DEBUG oslo_concurrency.lockutils [None 
req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.234s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2184.501368] env[67964]: DEBUG nova.compute.manager [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2184.531257] env[67964]: DEBUG nova.compute.utils [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2184.532385] env[67964]: DEBUG nova.compute.manager [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Allocating IP information in the background. {{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 2184.532554] env[67964]: DEBUG nova.network.neutron [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2184.540274] env[67964]: DEBUG nova.compute.manager [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2184.597934] env[67964]: DEBUG nova.policy [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'daaca12089eb4485b5607a9d577f33b2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '83336cd0155c4286b66ac327ef1385b5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 2184.600894] env[67964]: DEBUG nova.compute.manager [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2184.625401] env[67964]: DEBUG nova.virt.hardware [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=<?>,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T12:20:06Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2184.625633] env[67964]: DEBUG nova.virt.hardware [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2184.625784] env[67964]: DEBUG nova.virt.hardware [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2184.625961] env[67964]: DEBUG nova.virt.hardware [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2184.626356] env[67964]: DEBUG nova.virt.hardware [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2184.626356] env[67964]: DEBUG nova.virt.hardware [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2184.626488] env[67964]: DEBUG nova.virt.hardware [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2184.626636] env[67964]: DEBUG nova.virt.hardware [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2184.626801] env[67964]: DEBUG nova.virt.hardware [None
req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2184.626960] env[67964]: DEBUG nova.virt.hardware [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2184.627163] env[67964]: DEBUG nova.virt.hardware [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2184.628022] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c977f9ed-5a78-49fc-b2a0-04927fabc462 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2184.635700] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eeb72a40-1943-4641-8254-b1518ae5da25 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2184.948865] env[67964]: DEBUG nova.network.neutron [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Successfully created port: 92d9c473-305a-4ffa-a953-e8d90fc1f84a {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2185.517937] env[67964]: DEBUG nova.compute.manager [req-02302448-32d8-4d95-a2a7-e2016846b06d req-2ff8c6e2-c2b2-4e79-8cac-2b37abb411e6 service nova] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Received event network-vif-plugged-92d9c473-305a-4ffa-a953-e8d90fc1f84a {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 2185.518204] env[67964]: DEBUG oslo_concurrency.lockutils [req-02302448-32d8-4d95-a2a7-e2016846b06d req-2ff8c6e2-c2b2-4e79-8cac-2b37abb411e6 service nova] Acquiring lock "1327b0f7-bc48-4475-8e12-7dcb7bcf28b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2185.518372] env[67964]: DEBUG oslo_concurrency.lockutils [req-02302448-32d8-4d95-a2a7-e2016846b06d req-2ff8c6e2-c2b2-4e79-8cac-2b37abb411e6 service nova] Lock "1327b0f7-bc48-4475-8e12-7dcb7bcf28b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2185.518537] env[67964]: DEBUG oslo_concurrency.lockutils [req-02302448-32d8-4d95-a2a7-e2016846b06d req-2ff8c6e2-c2b2-4e79-8cac-2b37abb411e6 service nova] Lock "1327b0f7-bc48-4475-8e12-7dcb7bcf28b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2185.518699] env[67964]: DEBUG nova.compute.manager
[req-02302448-32d8-4d95-a2a7-e2016846b06d req-2ff8c6e2-c2b2-4e79-8cac-2b37abb411e6 service nova] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] No waiting events found dispatching network-vif-plugged-92d9c473-305a-4ffa-a953-e8d90fc1f84a {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2185.518854] env[67964]: WARNING nova.compute.manager [req-02302448-32d8-4d95-a2a7-e2016846b06d req-2ff8c6e2-c2b2-4e79-8cac-2b37abb411e6 service nova] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Received unexpected event network-vif-plugged-92d9c473-305a-4ffa-a953-e8d90fc1f84a for instance with vm_state building and task_state spawning. [ 2185.592461] env[67964]: DEBUG nova.network.neutron [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Successfully updated port: 92d9c473-305a-4ffa-a953-e8d90fc1f84a {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2185.602962] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "refresh_cache-1327b0f7-bc48-4475-8e12-7dcb7bcf28b5" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2185.603116] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquired lock "refresh_cache-1327b0f7-bc48-4475-8e12-7dcb7bcf28b5" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2185.603266] env[67964]: DEBUG nova.network.neutron [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2185.640847] env[67964]: DEBUG nova.network.neutron [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Instance cache missing network info. 
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2185.790897] env[67964]: DEBUG nova.network.neutron [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Updating instance_info_cache with network_info: [{"id": "92d9c473-305a-4ffa-a953-e8d90fc1f84a", "address": "fa:16:3e:2e:e2:8f", "network": {"id": "545a05d3-b8e2-435d-b1b5-1b6cb9a2d1ae", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1259553375-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83336cd0155c4286b66ac327ef1385b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap92d9c473-30", "ovs_interfaceid": "92d9c473-305a-4ffa-a953-e8d90fc1f84a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2185.800998] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Releasing lock "refresh_cache-1327b0f7-bc48-4475-8e12-7dcb7bcf28b5" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2185.801281] env[67964]: DEBUG nova.compute.manager [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Instance network_info: |[{"id": "92d9c473-305a-4ffa-a953-e8d90fc1f84a", "address": "fa:16:3e:2e:e2:8f", "network": {"id": "545a05d3-b8e2-435d-b1b5-1b6cb9a2d1ae", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1259553375-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83336cd0155c4286b66ac327ef1385b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap92d9c473-30", "ovs_interfaceid": "92d9c473-305a-4ffa-a953-e8d90fc1f84a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 2185.801669] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2e:e2:8f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '50886eea-591a-452c-a27b-5f22cfc9df85', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '92d9c473-305a-4ffa-a953-e8d90fc1f84a', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2185.809198] env[67964]: DEBUG oslo.service.loopingcall [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2185.809615] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2185.809835] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3a6d826a-e6ff-435d-bf6f-f1e8e575ec4e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2185.830272] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2185.830272] env[67964]: value = "task-3456896" [ 2185.830272] env[67964]: _type = "Task" [ 2185.830272] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2185.837717] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456896, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2186.340739] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456896, 'name': CreateVM_Task, 'duration_secs': 0.288017} completed successfully. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2186.340968] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2186.341713] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2186.341881] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2186.342323] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2186.342589] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-af0d5eb4-f27f-4796-9565-1384ea6d258e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2186.347502] env[67964]: DEBUG oslo_vmware.api [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Waiting for the task: (returnval){ [ 2186.347502] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]520b0537-06a3-cedb-1a01-646528437d9a" [ 2186.347502] env[67964]: _type = "Task" [ 2186.347502] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2186.355452] env[67964]: DEBUG oslo_vmware.api [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]520b0537-06a3-cedb-1a01-646528437d9a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2186.858790] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2186.859133] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2186.859245] env[67964]: DEBUG oslo_concurrency.lockutils [None req-2743f67b-779d-4e98-86a2-8480d18f0974 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2187.554229] env[67964]: DEBUG nova.compute.manager [req-5cdc9838-f4f3-428a-bbd4-62ebb19c27d2 req-fe0d5e54-204b-4a4b-8ec7-3a94e8dcd317 service nova] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Received event network-changed-92d9c473-305a-4ffa-a953-e8d90fc1f84a {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 2187.554422] env[67964]: DEBUG nova.compute.manager [req-5cdc9838-f4f3-428a-bbd4-62ebb19c27d2 req-fe0d5e54-204b-4a4b-8ec7-3a94e8dcd317 service nova] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Refreshing instance network info cache due to event network-changed-92d9c473-305a-4ffa-a953-e8d90fc1f84a. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2187.554629] env[67964]: DEBUG oslo_concurrency.lockutils [req-5cdc9838-f4f3-428a-bbd4-62ebb19c27d2 req-fe0d5e54-204b-4a4b-8ec7-3a94e8dcd317 service nova] Acquiring lock "refresh_cache-1327b0f7-bc48-4475-8e12-7dcb7bcf28b5" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2187.554770] env[67964]: DEBUG oslo_concurrency.lockutils [req-5cdc9838-f4f3-428a-bbd4-62ebb19c27d2 req-fe0d5e54-204b-4a4b-8ec7-3a94e8dcd317 service nova] Acquired lock "refresh_cache-1327b0f7-bc48-4475-8e12-7dcb7bcf28b5" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2187.554929] env[67964]: DEBUG nova.network.neutron [req-5cdc9838-f4f3-428a-bbd4-62ebb19c27d2 req-fe0d5e54-204b-4a4b-8ec7-3a94e8dcd317 service nova] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Refreshing network info cache for port 92d9c473-305a-4ffa-a953-e8d90fc1f84a {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2187.826629] env[67964]: DEBUG nova.network.neutron [req-5cdc9838-f4f3-428a-bbd4-62ebb19c27d2 req-fe0d5e54-204b-4a4b-8ec7-3a94e8dcd317 service nova] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Updated VIF entry in instance network info cache for port 92d9c473-305a-4ffa-a953-e8d90fc1f84a. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2187.826997] env[67964]: DEBUG nova.network.neutron [req-5cdc9838-f4f3-428a-bbd4-62ebb19c27d2 req-fe0d5e54-204b-4a4b-8ec7-3a94e8dcd317 service nova] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Updating instance_info_cache with network_info: [{"id": "92d9c473-305a-4ffa-a953-e8d90fc1f84a", "address": "fa:16:3e:2e:e2:8f", "network": {"id": "545a05d3-b8e2-435d-b1b5-1b6cb9a2d1ae", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1259553375-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83336cd0155c4286b66ac327ef1385b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap92d9c473-30", "ovs_interfaceid": "92d9c473-305a-4ffa-a953-e8d90fc1f84a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2187.835869] env[67964]: DEBUG oslo_concurrency.lockutils [req-5cdc9838-f4f3-428a-bbd4-62ebb19c27d2 req-fe0d5e54-204b-4a4b-8ec7-3a94e8dcd317 service nova] Releasing lock "refresh_cache-1327b0f7-bc48-4475-8e12-7dcb7bcf28b5" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2195.729407] env[67964]: WARNING oslo_vmware.rw_handles [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2195.729407] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2195.729407] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2195.729407] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2195.729407] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2195.729407] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 2195.729407] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2195.729407] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2195.729407] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2195.729407] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2195.729407] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2195.729407] env[67964]: ERROR oslo_vmware.rw_handles [ 2195.730107] env[67964]: DEBUG nova.virt.vmwareapi.images [None 
req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/c365af80-de6d-488c-a915-7ae0fea6a779/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2195.732869] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2195.733260] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Copying Virtual Disk [datastore1] vmware_temp/c365af80-de6d-488c-a915-7ae0fea6a779/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/c365af80-de6d-488c-a915-7ae0fea6a779/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2195.733682] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-bbbffaca-8f47-436b-b3e9-9e87bb39dd17 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2195.743352] env[67964]: DEBUG oslo_vmware.api [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Waiting for the task: (returnval){ [ 2195.743352] env[67964]: value = "task-3456897" [ 2195.743352] env[67964]: _type = "Task" [ 2195.743352] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2195.752322] env[67964]: DEBUG oslo_vmware.api [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Task: {'id': task-3456897, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2196.253446] env[67964]: DEBUG oslo_vmware.exceptions [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Fault InvalidArgument not matched. 
{{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2196.253720] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2196.254277] env[67964]: ERROR nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2196.254277] env[67964]: Faults: ['InvalidArgument'] [ 2196.254277] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Traceback (most recent call last): [ 2196.254277] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2196.254277] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] yield resources [ 2196.254277] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2196.254277] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] self.driver.spawn(context, instance, image_meta, [ 2196.254277] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2196.254277] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2196.254277] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2196.254277] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] self._fetch_image_if_missing(context, vi) [ 2196.254277] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2196.254612] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] image_cache(vi, tmp_image_ds_loc) [ 2196.254612] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2196.254612] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] vm_util.copy_virtual_disk( [ 2196.254612] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2196.254612] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] session._wait_for_task(vmdk_copy_task) [ 2196.254612] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2196.254612] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] return self.wait_for_task(task_ref) [ 2196.254612] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2196.254612] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] return evt.wait() [ 2196.254612] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2196.254612] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] result = hub.switch() [ 2196.254612] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2196.254612] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] return self.greenlet.switch() [ 2196.254910] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2196.254910] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] self.f(*self.args, **self.kw) [ 2196.254910] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2196.254910] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] raise exceptions.translate_fault(task_info.error) [ 2196.254910] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2196.254910] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Faults: ['InvalidArgument'] [ 2196.254910] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] [ 2196.254910] env[67964]: INFO nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Terminating instance [ 2196.256108] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2196.256349] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2196.256587] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2639e712-27aa-4503-aa33-0d132911ec7b 
{{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.258693] env[67964]: DEBUG nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2196.258882] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2196.259605] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e6337b1-425c-4dfd-b077-3f330270196a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.266373] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2196.266586] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dba87dec-ff3b-455e-aa41-8e8189760402 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.268670] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2196.268840] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2196.269764] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-95e0ee1b-ea1e-4578-9315-f0f9918aad14 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.274481] env[67964]: DEBUG oslo_vmware.api [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Waiting for the task: (returnval){ [ 2196.274481] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52d19438-6e73-5db3-2e5b-14a58b632682" [ 2196.274481] env[67964]: _type = "Task" [ 2196.274481] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2196.281503] env[67964]: DEBUG oslo_vmware.api [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52d19438-6e73-5db3-2e5b-14a58b632682, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2196.336144] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2196.336401] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2196.336612] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Deleting the datastore file [datastore1] bc98edf7-889e-4814-b859-d860033ba0cd {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2196.336869] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a5e4fe66-3f7f-4e63-b059-33ec31f90702 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.343554] env[67964]: DEBUG oslo_vmware.api [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Waiting for the task: (returnval){ [ 2196.343554] env[67964]: value = "task-3456899" [ 2196.343554] env[67964]: _type = "Task" [ 2196.343554] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2196.351087] env[67964]: DEBUG oslo_vmware.api [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Task: {'id': task-3456899, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2196.784990] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2196.785349] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Creating directory with path [datastore1] vmware_temp/a1ba305f-d854-4d4c-b4d8-31f01ff307a7/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2196.785434] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c6adbb3e-3f1e-4352-9424-bb6d21af933d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.796692] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Created directory with path [datastore1] vmware_temp/a1ba305f-d854-4d4c-b4d8-31f01ff307a7/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2196.796867] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Fetch image to [datastore1] vmware_temp/a1ba305f-d854-4d4c-b4d8-31f01ff307a7/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2196.797042] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/a1ba305f-d854-4d4c-b4d8-31f01ff307a7/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2196.797712] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65886f64-e9bd-4d9c-97a6-9b8ea3f84a44 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.803862] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29fb27c7-71cc-493c-9f19-31075de7925d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.812634] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-198d0b82-ea6c-4372-825b-42ae1694a03f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.842764] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4e69adf-3e83-4cff-9d2f-e5620ed12aea {{(pid=67964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.852710] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1afc16f7-86d0-43e0-8f96-3a3813fade28 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.854286] env[67964]: DEBUG oslo_vmware.api [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Task: {'id': task-3456899, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078057} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2196.854508] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2196.854676] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2196.854840] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2196.855013] env[67964]: INFO nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2196.856991] env[67964]: DEBUG nova.compute.claims [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 2196.857174] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2196.857401] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2196.877396] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 2196.932408] env[67964]: DEBUG oslo_vmware.rw_handles [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a1ba305f-d854-4d4c-b4d8-31f01ff307a7/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 2196.991025] env[67964]: DEBUG oslo_vmware.rw_handles [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 2196.991221] env[67964]: DEBUG oslo_vmware.rw_handles [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a1ba305f-d854-4d4c-b4d8-31f01ff307a7/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2197.058551] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a5ccca2-3348-4a4a-87c0-8787e2e23267 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2197.065628] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4173395a-66a7-4726-b609-9798935eb11f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2197.094661] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05080d3a-691e-48c9-bdff-e400539b64f0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2197.101207] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9813334b-7a30-4822-9cbe-6760cca0d2cc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2197.113810] env[67964]: DEBUG nova.compute.provider_tree [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2197.121885] env[67964]: DEBUG nova.scheduler.client.report [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}}
[ 2197.134968] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.278s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2197.135524] env[67964]: ERROR nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2197.135524] env[67964]: Faults: ['InvalidArgument']
[ 2197.135524] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Traceback (most recent call last):
[ 2197.135524] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]   File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 2197.135524] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]     self.driver.spawn(context, instance, image_meta,
[ 2197.135524] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2197.135524] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2197.135524] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2197.135524] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]     self._fetch_image_if_missing(context, vi)
[ 2197.135524] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2197.135524] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]     image_cache(vi, tmp_image_ds_loc)
[ 2197.135524] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2197.135883] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]     vm_util.copy_virtual_disk(
[ 2197.135883] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2197.135883] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]     session._wait_for_task(vmdk_copy_task)
[ 2197.135883] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2197.135883] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]     return self.wait_for_task(task_ref)
[ 2197.135883] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2197.135883] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]     return evt.wait()
[ 2197.135883] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2197.135883] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]     result = hub.switch()
[ 2197.135883] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2197.135883] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]     return self.greenlet.switch()
[ 2197.135883] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2197.135883] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]     self.f(*self.args, **self.kw)
[ 2197.136240] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2197.136240] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]     raise exceptions.translate_fault(task_info.error)
[ 2197.136240] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2197.136240] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Faults: ['InvalidArgument']
[ 2197.136240] env[67964]: ERROR nova.compute.manager [instance: bc98edf7-889e-4814-b859-d860033ba0cd]
[ 2197.136240] env[67964]: DEBUG nova.compute.utils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2197.137683] env[67964]: DEBUG nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Build of instance bc98edf7-889e-4814-b859-d860033ba0cd was re-scheduled: A specified parameter was not correct: fileType
[ 2197.137683] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 2197.138073] env[67964]: DEBUG nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 2197.138244] env[67964]: DEBUG nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 2197.138414] env[67964]: DEBUG nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 2197.138568] env[67964]: DEBUG nova.network.neutron [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2197.435860] env[67964]: DEBUG nova.network.neutron [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2197.447482] env[67964]: INFO nova.compute.manager [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Took 0.31 seconds to deallocate network for instance.
[ 2197.550214] env[67964]: INFO nova.scheduler.client.report [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Deleted allocations for instance bc98edf7-889e-4814-b859-d860033ba0cd
[ 2197.571769] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7bf9527a-bc53-4ff6-911b-9b9df69a5a91 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "bc98edf7-889e-4814-b859-d860033ba0cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 605.989s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2197.572019] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "bc98edf7-889e-4814-b859-d860033ba0cd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 423.845s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2197.572220] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] During sync_power_state the instance has a pending task (spawning). Skip.
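The block above is the crux of this section: CopyVirtualDisk_Task fails with an InvalidArgument fault on fileType while caching the sparse image, so the resource claim is aborted, the allocation is deleted, and the build is re-scheduled. A minimal sketch of how such a fault surfaces through oslo.vmware's task polling follows; the function name, disk_mgr, and copy_spec are illustrative assumptions, not Nova's own code:

    # Rough sketch, not Nova's code: how a VIM task fault becomes the
    # VimFaultException seen above. `session` stands in for an
    # oslo_vmware.api.VMwareAPISession.
    from oslo_vmware import exceptions as vexc

    def copy_vmdk(session, disk_mgr, **copy_spec):
        # CopyVirtualDisk_Task returns a task moref; wait_for_task() polls it
        # (the "progress is N%" lines later in this log) and raises once the
        # task's error field is set.
        task = session.invoke_api(session.vim, 'CopyVirtualDisk_Task',
                                  disk_mgr, **copy_spec)
        try:
            return session.wait_for_task(task)
        except vexc.VimFaultException as e:
            # e.fault_list names the VIM faults, e.g. ['InvalidArgument'],
            # matching the "Faults:" line in the traceback above; Nova lets
            # it propagate, aborts the claim, and re-schedules the build.
            raise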
[ 2197.572394] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "bc98edf7-889e-4814-b859-d860033ba0cd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2197.572803] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ded58270-c5e8-4ceb-85ce-d6018a15efa6 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "bc98edf7-889e-4814-b859-d860033ba0cd" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 410.030s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2197.572803] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ded58270-c5e8-4ceb-85ce-d6018a15efa6 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Acquiring lock "bc98edf7-889e-4814-b859-d860033ba0cd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2197.573030] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ded58270-c5e8-4ceb-85ce-d6018a15efa6 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "bc98edf7-889e-4814-b859-d860033ba0cd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2197.573183] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ded58270-c5e8-4ceb-85ce-d6018a15efa6 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "bc98edf7-889e-4814-b859-d860033ba0cd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2197.575209] env[67964]: INFO nova.compute.manager [None req-ded58270-c5e8-4ceb-85ce-d6018a15efa6 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Terminating instance [ 2197.577018] env[67964]: DEBUG nova.compute.manager [None req-ded58270-c5e8-4ceb-85ce-d6018a15efa6 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2197.577644] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ded58270-c5e8-4ceb-85ce-d6018a15efa6 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2197.577644] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-78939232-b14a-4a84-90a6-6b2a9948dfe8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2197.587236] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c21b8883-31f1-490d-af55-4eafbdb31070 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2197.614251] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-ded58270-c5e8-4ceb-85ce-d6018a15efa6 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bc98edf7-889e-4814-b859-d860033ba0cd could not be found. [ 2197.614497] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-ded58270-c5e8-4ceb-85ce-d6018a15efa6 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2197.614679] env[67964]: INFO nova.compute.manager [None req-ded58270-c5e8-4ceb-85ce-d6018a15efa6 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2197.614919] env[67964]: DEBUG oslo.service.loopingcall [None req-ded58270-c5e8-4ceb-85ce-d6018a15efa6 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2197.615176] env[67964]: DEBUG nova.compute.manager [-] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2197.615274] env[67964]: DEBUG nova.network.neutron [-] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2197.639193] env[67964]: DEBUG nova.network.neutron [-] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2197.646551] env[67964]: INFO nova.compute.manager [-] [instance: bc98edf7-889e-4814-b859-d860033ba0cd] Took 0.03 seconds to deallocate network for instance. 
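The lock bookkeeping above is worth a gloss: the terminate request waited 410.030s on the per-instance lock while the failed build held it for 605.989s, and the destroy that finally runs finds no VM on the backend (InstanceNotFound), so the instance is simply marked destroyed. The Acquiring/acquired/released triples are the DEBUG logging built into oslo.concurrency's lock helpers; a rough sketch of the serialization pattern, with a hypothetical function standing in for Nova's inner closure:

    from oslo_concurrency import lockutils

    # Serializing on the instance UUID is what makes terminate wait out the
    # in-flight build; the "waited Ns / held Ns" figures above come from the
    # logging inside this decorator (oslo_concurrency/lockutils.py).
    @lockutils.synchronized('bc98edf7-889e-4814-b859-d860033ba0cd')
    def do_terminate_instance():
        # Hypothetical stand-in: the body runs only once the build holder
        # releases the lock.
        pass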
[ 2197.729700] env[67964]: DEBUG oslo_concurrency.lockutils [None req-ded58270-c5e8-4ceb-85ce-d6018a15efa6 tempest-AttachVolumeNegativeTest-222872873 tempest-AttachVolumeNegativeTest-222872873-project-member] Lock "bc98edf7-889e-4814-b859-d860033ba0cd" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.157s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2207.290654] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Acquiring lock "1d709fb2-2bfe-463c-b39c-06e4a31cb0de" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2207.291009] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Lock "1d709fb2-2bfe-463c-b39c-06e4a31cb0de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2207.301340] env[67964]: DEBUG nova.compute.manager [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Starting instance... {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2207.352601] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2207.352898] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2207.354552] env[67964]: INFO nova.compute.claims [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2207.507761] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b84afed-6b85-445c-986a-268b327e411a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2207.516831] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b3df645-2360-4ba9-a172-894902865b25 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2207.545726] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx 
with opID=oslo.vmware-dd522c8c-115c-4487-93b9-b2b4006dab90 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2207.552927] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22bdd06b-e606-4bd8-b144-51a2788ac1da {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2207.565767] env[67964]: DEBUG nova.compute.provider_tree [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2207.577118] env[67964]: DEBUG nova.scheduler.client.report [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2207.600458] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.247s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2207.600954] env[67964]: DEBUG nova.compute.manager [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Start building networks asynchronously for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2207.640723] env[67964]: DEBUG nova.compute.utils [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Using /dev/sd instead of None {{(pid=67964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2207.642763] env[67964]: DEBUG nova.compute.manager [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Allocating IP information in the background. 
{{(pid=67964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 2207.642979] env[67964]: DEBUG nova.network.neutron [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] allocate_for_instance() {{(pid=67964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2207.651491] env[67964]: DEBUG nova.compute.manager [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Start building block device mappings for instance. {{(pid=67964) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2207.715831] env[67964]: DEBUG nova.policy [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1015d4d803fa4af58c5cd22ce6a9e3ee', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b91766f0116b4b7a9836a5cf09598387', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67964) authorize /opt/stack/nova/nova/policy.py:203}} [ 2207.719216] env[67964]: DEBUG nova.compute.manager [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Start spawning the instance on the hypervisor. 
{{(pid=67964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2207.744947] env[67964]: DEBUG nova.virt.hardware [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T12:20:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T12:20:05Z,direct_url=,disk_format='vmdk',id=b261268a-9800-40a9-afde-85d61f8eed6a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='34fc8bdd38bd4d2781a21b19049364a0',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T12:20:06Z,virtual_size=,visibility=), allow threads: False {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2207.745209] env[67964]: DEBUG nova.virt.hardware [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Flavor limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2207.745362] env[67964]: DEBUG nova.virt.hardware [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Image limits 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2207.745539] env[67964]: DEBUG nova.virt.hardware [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Flavor pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2207.745681] env[67964]: DEBUG nova.virt.hardware [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Image pref 0:0:0 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2207.745868] env[67964]: DEBUG nova.virt.hardware [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2207.746114] env[67964]: DEBUG nova.virt.hardware [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2207.746274] env[67964]: DEBUG nova.virt.hardware [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2207.746467] env[67964]: DEBUG nova.virt.hardware [None 
req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Got 1 possible topologies {{(pid=67964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2207.746646] env[67964]: DEBUG nova.virt.hardware [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2207.746822] env[67964]: DEBUG nova.virt.hardware [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2207.747693] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64db95cf-bad0-46b8-91e8-3be32f3f265b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2207.757088] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebf97e1c-5bee-418d-a250-821a8d89afbc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.036789] env[67964]: DEBUG nova.network.neutron [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Successfully created port: 09428844-1bcc-4896-a2a8-e4ea353e430c {{(pid=67964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2208.555919] env[67964]: DEBUG nova.compute.manager [req-2bbf3b01-7ae9-4877-ae8d-0a74afad42ad req-0ee30e84-779e-48b5-adb3-a9f75f3a3bda service nova] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Received event network-vif-plugged-09428844-1bcc-4896-a2a8-e4ea353e430c {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 2208.556196] env[67964]: DEBUG oslo_concurrency.lockutils [req-2bbf3b01-7ae9-4877-ae8d-0a74afad42ad req-0ee30e84-779e-48b5-adb3-a9f75f3a3bda service nova] Acquiring lock "1d709fb2-2bfe-463c-b39c-06e4a31cb0de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2208.556355] env[67964]: DEBUG oslo_concurrency.lockutils [req-2bbf3b01-7ae9-4877-ae8d-0a74afad42ad req-0ee30e84-779e-48b5-adb3-a9f75f3a3bda service nova] Lock "1d709fb2-2bfe-463c-b39c-06e4a31cb0de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2208.556553] env[67964]: DEBUG oslo_concurrency.lockutils [req-2bbf3b01-7ae9-4877-ae8d-0a74afad42ad req-0ee30e84-779e-48b5-adb3-a9f75f3a3bda service nova] Lock "1d709fb2-2bfe-463c-b39c-06e4a31cb0de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2208.556718] env[67964]: DEBUG nova.compute.manager 
[req-2bbf3b01-7ae9-4877-ae8d-0a74afad42ad req-0ee30e84-779e-48b5-adb3-a9f75f3a3bda service nova] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] No waiting events found dispatching network-vif-plugged-09428844-1bcc-4896-a2a8-e4ea353e430c {{(pid=67964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2208.556876] env[67964]: WARNING nova.compute.manager [req-2bbf3b01-7ae9-4877-ae8d-0a74afad42ad req-0ee30e84-779e-48b5-adb3-a9f75f3a3bda service nova] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Received unexpected event network-vif-plugged-09428844-1bcc-4896-a2a8-e4ea353e430c for instance with vm_state building and task_state spawning. [ 2208.631730] env[67964]: DEBUG nova.network.neutron [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Successfully updated port: 09428844-1bcc-4896-a2a8-e4ea353e430c {{(pid=67964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2208.649320] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Acquiring lock "refresh_cache-1d709fb2-2bfe-463c-b39c-06e4a31cb0de" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2208.649539] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Acquired lock "refresh_cache-1d709fb2-2bfe-463c-b39c-06e4a31cb0de" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2208.649827] env[67964]: DEBUG nova.network.neutron [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2208.690447] env[67964]: DEBUG nova.network.neutron [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Instance cache missing network info. 
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2208.884736] env[67964]: DEBUG nova.network.neutron [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Updating instance_info_cache with network_info: [{"id": "09428844-1bcc-4896-a2a8-e4ea353e430c", "address": "fa:16:3e:ee:3a:7e", "network": {"id": "100329aa-496f-4ce4-9cdf-d91d1be17bff", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-314219850-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b91766f0116b4b7a9836a5cf09598387", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cba18f15-a919-422e-a423-1e705e233389", "external-id": "nsx-vlan-transportzone-79", "segmentation_id": 79, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap09428844-1b", "ovs_interfaceid": "09428844-1bcc-4896-a2a8-e4ea353e430c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2208.896137] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Releasing lock "refresh_cache-1d709fb2-2bfe-463c-b39c-06e4a31cb0de" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2208.896410] env[67964]: DEBUG nova.compute.manager [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Instance network_info: |[{"id": "09428844-1bcc-4896-a2a8-e4ea353e430c", "address": "fa:16:3e:ee:3a:7e", "network": {"id": "100329aa-496f-4ce4-9cdf-d91d1be17bff", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-314219850-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b91766f0116b4b7a9836a5cf09598387", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cba18f15-a919-422e-a423-1e705e233389", "external-id": "nsx-vlan-transportzone-79", "segmentation_id": 79, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap09428844-1b", "ovs_interfaceid": "09428844-1bcc-4896-a2a8-e4ea353e430c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 2208.896805] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ee:3a:7e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'cba18f15-a919-422e-a423-1e705e233389', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '09428844-1bcc-4896-a2a8-e4ea353e430c', 'vif_model': 'vmxnet3'}] {{(pid=67964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2208.904203] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Creating folder: Project (b91766f0116b4b7a9836a5cf09598387). Parent ref: group-v690366. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2208.904694] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-99bb1788-75f5-45db-ba4f-a317a83e5251 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.914368] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Created folder: Project (b91766f0116b4b7a9836a5cf09598387) in parent group-v690366. [ 2208.914551] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Creating folder: Instances. Parent ref: group-v690478. {{(pid=67964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2208.914759] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-92be614e-9c03-4557-891a-00d206966ee1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.922611] env[67964]: INFO nova.virt.vmwareapi.vm_util [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Created folder: Instances in parent group-v690478. [ 2208.922819] env[67964]: DEBUG oslo.service.loopingcall [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2208.922984] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Creating VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2208.923174] env[67964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-531a82f9-30a2-4537-8ad0-c11b2d453a8a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.940558] env[67964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2208.940558] env[67964]: value = "task-3456902" [ 2208.940558] env[67964]: _type = "Task" [ 2208.940558] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2208.947491] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456902, 'name': CreateVM_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2209.450527] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456902, 'name': CreateVM_Task} progress is 99%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2209.951497] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456902, 'name': CreateVM_Task} progress is 99%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2210.451995] env[67964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456902, 'name': CreateVM_Task, 'duration_secs': 1.278705} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2210.452183] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Created VM on the ESX host {{(pid=67964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2210.452828] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2210.452988] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2210.453316] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2210.453559] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2eb2ddff-2de2-4baa-b24a-60aabfdbef8d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2210.457870] env[67964]: DEBUG oslo_vmware.api [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Waiting for the task: (returnval){ [ 2210.457870] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52be37c3-f226-b24d-87b2-182d42a2a5e9" [ 2210.457870] env[67964]: _type = "Task" [ 2210.457870] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2210.464952] env[67964]: DEBUG oslo_vmware.api [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52be37c3-f226-b24d-87b2-182d42a2a5e9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2210.586079] env[67964]: DEBUG nova.compute.manager [req-ae6f02ff-9542-4476-a125-7071bf303116 req-0fbfb1f2-58b6-4b7c-9ddd-dc55885e70e7 service nova] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Received event network-changed-09428844-1bcc-4896-a2a8-e4ea353e430c {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 2210.586292] env[67964]: DEBUG nova.compute.manager [req-ae6f02ff-9542-4476-a125-7071bf303116 req-0fbfb1f2-58b6-4b7c-9ddd-dc55885e70e7 service nova] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Refreshing instance network info cache due to event network-changed-09428844-1bcc-4896-a2a8-e4ea353e430c. {{(pid=67964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2210.586554] env[67964]: DEBUG oslo_concurrency.lockutils [req-ae6f02ff-9542-4476-a125-7071bf303116 req-0fbfb1f2-58b6-4b7c-9ddd-dc55885e70e7 service nova] Acquiring lock "refresh_cache-1d709fb2-2bfe-463c-b39c-06e4a31cb0de" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2210.586708] env[67964]: DEBUG oslo_concurrency.lockutils [req-ae6f02ff-9542-4476-a125-7071bf303116 req-0fbfb1f2-58b6-4b7c-9ddd-dc55885e70e7 service nova] Acquired lock "refresh_cache-1d709fb2-2bfe-463c-b39c-06e4a31cb0de" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2210.586866] env[67964]: DEBUG nova.network.neutron [req-ae6f02ff-9542-4476-a125-7071bf303116 req-0fbfb1f2-58b6-4b7c-9ddd-dc55885e70e7 service nova] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Refreshing network info cache for port 09428844-1bcc-4896-a2a8-e4ea353e430c {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2210.818013] env[67964]: DEBUG nova.network.neutron [req-ae6f02ff-9542-4476-a125-7071bf303116 req-0fbfb1f2-58b6-4b7c-9ddd-dc55885e70e7 service nova] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Updated VIF entry in instance network info cache for port 09428844-1bcc-4896-a2a8-e4ea353e430c. 
{{(pid=67964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2210.818374] env[67964]: DEBUG nova.network.neutron [req-ae6f02ff-9542-4476-a125-7071bf303116 req-0fbfb1f2-58b6-4b7c-9ddd-dc55885e70e7 service nova] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Updating instance_info_cache with network_info: [{"id": "09428844-1bcc-4896-a2a8-e4ea353e430c", "address": "fa:16:3e:ee:3a:7e", "network": {"id": "100329aa-496f-4ce4-9cdf-d91d1be17bff", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-314219850-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b91766f0116b4b7a9836a5cf09598387", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cba18f15-a919-422e-a423-1e705e233389", "external-id": "nsx-vlan-transportzone-79", "segmentation_id": 79, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap09428844-1b", "ovs_interfaceid": "09428844-1bcc-4896-a2a8-e4ea353e430c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2210.828821] env[67964]: DEBUG oslo_concurrency.lockutils [req-ae6f02ff-9542-4476-a125-7071bf303116 req-0fbfb1f2-58b6-4b7c-9ddd-dc55885e70e7 service nova] Releasing lock "refresh_cache-1d709fb2-2bfe-463c-b39c-06e4a31cb0de" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2210.969343] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2210.969638] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Processing image b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2210.969779] env[67964]: DEBUG oslo_concurrency.lockutils [None req-3d7f8e29-b470-48f6-9f83-6c54bc9b950b tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2230.801070] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2230.813703] env[67964]: DEBUG oslo_concurrency.lockutils [None 
req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2230.813903] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2230.814094] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2230.814253] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2230.815382] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1918808-4f26-4db8-b4b4-9ba06624d597 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.824214] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4239c434-ef7e-423b-9292-0e82f323fce0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.837707] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8dcadea2-57e0-454b-92bc-be67cf6a9490 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.843734] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba4e271b-b769-42b7-a574-58b1dc24a546 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.871379] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180855MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2230.871521] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2230.871692] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2230.936679] env[67964]: DEBUG nova.compute.resource_tracker [None 
req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance c01bc11b-384e-418e-be43-e12d0a845a24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2230.936865] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2230.936998] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 07489f39-f57c-4528-80b8-b42056181b8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2230.937148] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 3e0e0504-9c76-4201-baf8-2d9636981f0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2230.937268] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance aa9c54a7-7b81-45cb-9f53-2016f4ea4b72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2230.937385] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2a0e1c08-8201-4ed7-9072-fdd90f25f120 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2230.937562] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2230.937703] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 1d709fb2-2bfe-463c-b39c-06e4a31cb0de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2230.937881] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2230.938027] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2231.036215] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ba517bc-d374-4f1f-9d6c-a1152edfe7b8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2231.043645] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ada0f428-8559-4235-a8e5-6aa94145b6e1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2231.071863] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccf90665-d7f1-4733-b5c1-d7ae5e280482 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2231.078429] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecaee012-4981-4dc8-8251-ed830fd05783 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2231.090830] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2231.098843] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2231.112449] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2231.112625] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.241s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2236.672639] env[67964]: DEBUG oslo_concurrency.lockutils [None 
req-9ce6af09-ace3-457d-9e00-6bcdc052453e tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "2a0e1c08-8201-4ed7-9072-fdd90f25f120" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2239.113615] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2239.797073] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2239.799609] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2240.800236] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2240.800595] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2241.801222] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2241.801588] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 2242.801374] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2242.801731] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 2242.801731] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 2242.819144] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Skipping network cache update for instance because it is Building.
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2242.819289] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2242.819429] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2242.819547] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2242.819667] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2242.819786] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2242.819901] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2242.820148] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2242.820338] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 2242.895241] env[67964]: WARNING oslo_vmware.rw_handles [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2242.895241] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2242.895241] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2242.895241] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2242.895241] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2242.895241] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 2242.895241] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2242.895241] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2242.895241] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2242.895241] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2242.895241] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2242.895241] env[67964]: ERROR oslo_vmware.rw_handles [ 2242.895724] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/a1ba305f-d854-4d4c-b4d8-31f01ff307a7/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2242.897728] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2242.897973] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Copying Virtual Disk [datastore1] vmware_temp/a1ba305f-d854-4d4c-b4d8-31f01ff307a7/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/a1ba305f-d854-4d4c-b4d8-31f01ff307a7/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2242.898249] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-572b2671-33aa-419c-ab7a-08831d470a63 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2242.907079] env[67964]: DEBUG oslo_vmware.api [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Waiting for the task: (returnval){ [ 
2242.907079] env[67964]: value = "task-3456903" [ 2242.907079] env[67964]: _type = "Task" [ 2242.907079] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2242.914791] env[67964]: DEBUG oslo_vmware.api [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Task: {'id': task-3456903, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2243.417785] env[67964]: DEBUG oslo_vmware.exceptions [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2243.418087] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2243.418692] env[67964]: ERROR nova.compute.manager [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2243.418692] env[67964]: Faults: ['InvalidArgument'] [ 2243.418692] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Traceback (most recent call last): [ 2243.418692] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2243.418692] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] yield resources [ 2243.418692] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2243.418692] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self.driver.spawn(context, instance, image_meta, [ 2243.418692] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2243.418692] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2243.418692] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2243.418692] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self._fetch_image_if_missing(context, vi) [ 2243.418692] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2243.419026] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] 
image_cache(vi, tmp_image_ds_loc) [ 2243.419026] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2243.419026] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] vm_util.copy_virtual_disk( [ 2243.419026] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2243.419026] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] session._wait_for_task(vmdk_copy_task) [ 2243.419026] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2243.419026] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] return self.wait_for_task(task_ref) [ 2243.419026] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2243.419026] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] return evt.wait() [ 2243.419026] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2243.419026] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] result = hub.switch() [ 2243.419026] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2243.419026] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] return self.greenlet.switch() [ 2243.419715] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2243.419715] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self.f(*self.args, **self.kw) [ 2243.419715] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2243.419715] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] raise exceptions.translate_fault(task_info.error) [ 2243.419715] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2243.419715] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Faults: ['InvalidArgument'] [ 2243.419715] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] [ 2243.419715] env[67964]: INFO nova.compute.manager [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Terminating instance [ 2243.420640] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 
tempest-ServersTestJSON-385895298-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2243.420854] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2243.421112] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d5034b77-c278-47ef-bdd5-4464eabb0775 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2243.424645] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquiring lock "refresh_cache-c01bc11b-384e-418e-be43-e12d0a845a24" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2243.424826] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquired lock "refresh_cache-c01bc11b-384e-418e-be43-e12d0a845a24" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2243.424985] env[67964]: DEBUG nova.network.neutron [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2243.428668] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2243.428834] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2243.429553] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6c930fd7-2b2e-419a-abbc-4df15c6ba270 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2243.436850] env[67964]: DEBUG oslo_vmware.api [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Waiting for the task: (returnval){ [ 2243.436850] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]529d21ed-c74b-e46b-3f11-a097be696551" [ 2243.436850] env[67964]: _type = "Task" [ 2243.436850] env[67964]: } to complete. 
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2243.445840] env[67964]: DEBUG oslo_vmware.api [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]529d21ed-c74b-e46b-3f11-a097be696551, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2243.452798] env[67964]: DEBUG nova.network.neutron [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2243.514680] env[67964]: DEBUG nova.network.neutron [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2243.523756] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Releasing lock "refresh_cache-c01bc11b-384e-418e-be43-e12d0a845a24" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2243.524176] env[67964]: DEBUG nova.compute.manager [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2243.524368] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2243.525391] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d7e3aed-49a8-4eeb-a5ee-e160217ad3df {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2243.532747] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2243.533172] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a85152c0-102e-4f57-98c6-f99879ff7358 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2243.561246] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2243.561445] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2243.561616] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Deleting the datastore file [datastore1] c01bc11b-384e-418e-be43-e12d0a845a24 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2243.561841] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-33ab420d-075e-4a1a-b73c-7aa401bb59f5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2243.567324] env[67964]: DEBUG oslo_vmware.api [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Waiting for the task: (returnval){ [ 2243.567324] env[67964]: value = "task-3456905" [ 2243.567324] env[67964]: _type = "Task" [ 2243.567324] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2243.574584] env[67964]: DEBUG oslo_vmware.api [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Task: {'id': task-3456905, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2243.799617] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2243.947474] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2243.947753] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Creating directory with path [datastore1] vmware_temp/a2787e62-36c7-40ff-b24a-27486cac4b97/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2243.947995] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2210baae-3302-4c4d-919e-071cf8b5b938 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2243.959273] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Created directory with path [datastore1] vmware_temp/a2787e62-36c7-40ff-b24a-27486cac4b97/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2243.959443] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Fetch image to [datastore1] vmware_temp/a2787e62-36c7-40ff-b24a-27486cac4b97/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2243.959604] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/a2787e62-36c7-40ff-b24a-27486cac4b97/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2243.960308] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c083bb7f-3bff-4e2b-b589-135f9306f713 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2243.966565] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40fa6980-d204-4e14-88a1-a76eed42f392 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2243.975129] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-918f0055-291a-4ac2-957f-3b3ecb4c65d0 {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2244.005231] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d63e7a33-d238-4dc3-8f8e-e0c517345a77 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2244.010512] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e7f4b1db-f7d0-4480-97ee-a482f22e2579 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2244.029681] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2244.075872] env[67964]: DEBUG oslo_vmware.api [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Task: {'id': task-3456905, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.042781} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2244.076138] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2244.076326] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2244.076492] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2244.076660] env[67964]: INFO nova.compute.manager [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Took 0.55 seconds to destroy the instance on the hypervisor. [ 2244.076894] env[67964]: DEBUG oslo.service.loopingcall [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2244.077132] env[67964]: DEBUG nova.compute.manager [-] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Skipping network deallocation for instance since networking was not requested.
{{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 2244.079211] env[67964]: DEBUG oslo_vmware.rw_handles [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a2787e62-36c7-40ff-b24a-27486cac4b97/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2244.080722] env[67964]: DEBUG nova.compute.claims [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2244.080887] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2244.081117] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2244.141289] env[67964]: DEBUG oslo_vmware.rw_handles [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2244.141449] env[67964]: DEBUG oslo_vmware.rw_handles [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a2787e62-36c7-40ff-b24a-27486cac4b97/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2244.236021] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe154ec7-8191-4af4-969e-0929e1155e5a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2244.243219] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbb9bc63-4aba-4a82-a8be-cb260466c6f6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2244.272330] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f03ab25-27a2-45b5-a269-940f89738286 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2244.278965] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7845ad9-fd2f-4040-9e23-c00b39dba9bf {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2244.292418] env[67964]: DEBUG nova.compute.provider_tree [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2244.300823] env[67964]: DEBUG nova.scheduler.client.report [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2244.313754] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.233s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2244.314315] env[67964]: ERROR nova.compute.manager [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2244.314315] env[67964]: Faults: ['InvalidArgument'] [ 2244.314315] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Traceback (most recent call last): [ 2244.314315] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2244.314315] env[67964]: ERROR 
nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self.driver.spawn(context, instance, image_meta, [ 2244.314315] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2244.314315] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2244.314315] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2244.314315] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self._fetch_image_if_missing(context, vi) [ 2244.314315] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2244.314315] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] image_cache(vi, tmp_image_ds_loc) [ 2244.314315] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2244.314655] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] vm_util.copy_virtual_disk( [ 2244.314655] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2244.314655] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] session._wait_for_task(vmdk_copy_task) [ 2244.314655] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2244.314655] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] return self.wait_for_task(task_ref) [ 2244.314655] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2244.314655] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] return evt.wait() [ 2244.314655] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2244.314655] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] result = hub.switch() [ 2244.314655] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2244.314655] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] return self.greenlet.switch() [ 2244.314655] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2244.314655] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self.f(*self.args, **self.kw) [ 2244.315043] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2244.315043] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] raise exceptions.translate_fault(task_info.error) [ 2244.315043] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2244.315043] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Faults: ['InvalidArgument'] [ 2244.315043] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] [ 2244.315193] env[67964]: DEBUG nova.compute.utils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2244.316531] env[67964]: DEBUG nova.compute.manager [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Build of instance c01bc11b-384e-418e-be43-e12d0a845a24 was re-scheduled: A specified parameter was not correct: fileType [ 2244.316531] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2244.316902] env[67964]: DEBUG nova.compute.manager [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2244.317158] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquiring lock "refresh_cache-c01bc11b-384e-418e-be43-e12d0a845a24" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2244.317318] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquired lock "refresh_cache-c01bc11b-384e-418e-be43-e12d0a845a24" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2244.317473] env[67964]: DEBUG nova.network.neutron [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Building network info cache for instance {{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2244.340032] env[67964]: DEBUG nova.network.neutron [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Instance cache missing network info. 
{{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2244.397751] env[67964]: DEBUG nova.network.neutron [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2244.405798] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Releasing lock "refresh_cache-c01bc11b-384e-418e-be43-e12d0a845a24" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2244.405971] env[67964]: DEBUG nova.compute.manager [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2244.406167] env[67964]: DEBUG nova.compute.manager [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Skipping network deallocation for instance since networking was not requested. {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 2244.487278] env[67964]: INFO nova.scheduler.client.report [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Deleted allocations for instance c01bc11b-384e-418e-be43-e12d0a845a24 [ 2244.503916] env[67964]: DEBUG oslo_concurrency.lockutils [None req-4f6fcb1b-4eea-4760-90d1-744cd2dcfadd tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Lock "c01bc11b-384e-418e-be43-e12d0a845a24" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 617.697s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2244.504203] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "c01bc11b-384e-418e-be43-e12d0a845a24" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 470.777s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2244.504391] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] During sync_power_state the instance has a pending task (spawning). Skip.
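The "acquired ... :: waited" and '"released" ... :: held' pairs in the records above come from oslo.concurrency's lockutils, which Nova uses to serialize build, power-state sync, and terminate operations on the same instance UUID. A minimal sketch of that pattern follows; it uses the real lockutils.synchronized decorator, but the lock name and the guarded function are illustrative, not Nova's actual code:

```python
import time

from oslo_concurrency import lockutils

# lockutils emits the DEBUG "Lock ... acquired ... waited" and
# '"released" ... held' lines seen above each time a synchronized
# section is entered and exited.
@lockutils.synchronized('c01bc11b-384e-418e-be43-e12d0a845a24')
def do_terminate_instance():
    # Runs with the per-instance lock held; a concurrent caller using the
    # same lock name blocks here, accumulating the "waited 421.997s" time.
    time.sleep(0.1)

do_terminate_instance()
```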
[ 2244.504559] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "c01bc11b-384e-418e-be43-e12d0a845a24" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2244.504805] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Lock "c01bc11b-384e-418e-be43-e12d0a845a24" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 421.997s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2244.505025] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquiring lock "c01bc11b-384e-418e-be43-e12d0a845a24-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2244.505234] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Lock "c01bc11b-384e-418e-be43-e12d0a845a24-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2244.505640] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Lock "c01bc11b-384e-418e-be43-e12d0a845a24-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2244.507726] env[67964]: INFO nova.compute.manager [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Terminating instance [ 2244.509380] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquiring lock "refresh_cache-c01bc11b-384e-418e-be43-e12d0a845a24" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2244.509599] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Acquired lock "refresh_cache-c01bc11b-384e-418e-be43-e12d0a845a24" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2244.509853] env[67964]: DEBUG nova.network.neutron [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Building network info cache for instance
{{(pid=67964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2244.543402] env[67964]: DEBUG nova.network.neutron [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Instance cache missing network info. {{(pid=67964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2244.603485] env[67964]: DEBUG nova.network.neutron [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2244.612725] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Releasing lock "refresh_cache-c01bc11b-384e-418e-be43-e12d0a845a24" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2244.613141] env[67964]: DEBUG nova.compute.manager [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2244.613351] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2244.613844] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cc23ae63-79bb-4a93-90cf-b8b99851f2e3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2244.622975] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42504909-3857-467d-a1f2-03447d4bc5eb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2244.649724] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c01bc11b-384e-418e-be43-e12d0a845a24 could not be found. [ 2244.649900] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2244.650084] env[67964]: INFO nova.compute.manager [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Took 0.04 seconds to destroy the instance on the hypervisor. 
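The WARNING just above shows the destroy path tolerating a VM that is already gone on the vCenter side: the backend lookup raises InstanceNotFound, the driver swallows it, and the instance is still reported destroyed so teardown can continue. A rough, self-contained sketch of that idempotent-destroy behaviour, using stand-in exception and lookup helpers rather than Nova's vmops code:

```python
class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""

def lookup_vm(uuid):
    # Hypothetical backend search (e.g. SearchIndex.FindAllByUuid); here the
    # VM is already missing, as in the log above.
    raise InstanceNotFound(uuid)

def destroy(uuid):
    try:
        vm_ref = lookup_vm(uuid)
        # UnregisterVM and datastore file deletion would happen here.
    except InstanceNotFound:
        # Treat "already gone" as destroyed so terminate can still proceed
        # to network deallocation and placement cleanup.
        print(f"Instance {uuid} does not exist on backend; continuing teardown")

destroy("c01bc11b-384e-418e-be43-e12d0a845a24")
```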
[ 2244.650314] env[67964]: DEBUG oslo.service.loopingcall [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2244.650518] env[67964]: DEBUG nova.compute.manager [-] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2244.650612] env[67964]: DEBUG nova.network.neutron [-] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2244.768484] env[67964]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67964) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 2244.768759] env[67964]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 2244.769345] env[67964]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-5851bd7c-d7ac-4a3b-9dba-f6a7406d8278'] [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2244.769345] env[67964]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall self._deallocate_network( [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2244.769790] env[67964]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2244.769790] env[67964]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2244.770449] env[67964]: ERROR oslo.service.loopingcall [ 2244.770800] env[67964]: ERROR nova.compute.manager [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2244.801171] env[67964]: ERROR nova.compute.manager [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 2244.801171] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Traceback (most recent call last): [ 2244.801171] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.801171] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] ret = obj(*args, **kwargs) [ 2244.801171] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2244.801171] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] exception_handler_v20(status_code, error_body) [ 2244.801171] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2244.801171] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] raise client_exc(message=error_message, [ 2244.801171] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2244.801171] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Neutron server returns request_ids: ['req-5851bd7c-d7ac-4a3b-9dba-f6a7406d8278'] [ 2244.801171] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] [ 2244.801509] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] During handling of the above exception, another exception occurred: [ 2244.801509] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] [ 2244.801509] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Traceback (most recent call last): [ 2244.801509] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 2244.801509] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self._delete_instance(context, instance, bdms) [ 2244.801509] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 2244.801509] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self._shutdown_instance(context, instance, bdms) [ 2244.801509] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 2244.801509] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self._try_deallocate_network(context, instance, requested_networks) [ 2244.801509] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 2244.801509] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] with excutils.save_and_reraise_exception(): [ 2244.801509] env[67964]: ERROR 
nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2244.801509] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self.force_reraise() [ 2244.801810] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2244.801810] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] raise self.value [ 2244.801810] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 2244.801810] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] _deallocate_network_with_retries() [ 2244.801810] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 2244.801810] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] return evt.wait() [ 2244.801810] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2244.801810] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] result = hub.switch() [ 2244.801810] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2244.801810] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] return self.greenlet.switch() [ 2244.801810] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2244.801810] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] result = func(*self.args, **self.kw) [ 2244.802088] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2244.802088] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] result = f(*args, **kwargs) [ 2244.802088] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 2244.802088] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self._deallocate_network( [ 2244.802088] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 2244.802088] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self.network_api.deallocate_for_instance( [ 2244.802088] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2244.802088] env[67964]: ERROR nova.compute.manager [instance: 
c01bc11b-384e-418e-be43-e12d0a845a24] data = neutron.list_ports(**search_opts) [ 2244.802088] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.802088] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] ret = obj(*args, **kwargs) [ 2244.802088] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2244.802088] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] return self.list('ports', self.ports_path, retrieve_all, [ 2244.802088] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.802385] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] ret = obj(*args, **kwargs) [ 2244.802385] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2244.802385] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] for r in self._pagination(collection, path, **params): [ 2244.802385] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2244.802385] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] res = self.get(path, params=params) [ 2244.802385] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.802385] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] ret = obj(*args, **kwargs) [ 2244.802385] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2244.802385] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] return self.retry_request("GET", action, body=body, [ 2244.802385] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.802385] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] ret = obj(*args, **kwargs) [ 2244.802385] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2244.802385] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] return self.do_request(method, action, body=body, [ 2244.802720] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.802720] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] ret = obj(*args, **kwargs) [ 2244.802720] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2244.802720] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] self._handle_fault_response(status_code, replybody, resp) [ 2244.802720] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2244.802720] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2244.802720] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2244.802720] env[67964]: ERROR nova.compute.manager [instance: c01bc11b-384e-418e-be43-e12d0a845a24] [ 2244.827129] env[67964]: DEBUG oslo_concurrency.lockutils [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Lock "c01bc11b-384e-418e-be43-e12d0a845a24" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.322s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2244.873089] env[67964]: INFO nova.compute.manager [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] [instance: c01bc11b-384e-418e-be43-e12d0a845a24] Successfully reverted task state from None on failure for instance. [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server [None req-71147913-1053-4c52-9189-2c6a26f48ef8 tempest-ServersAaction247Test-573411585 tempest-ServersAaction247Test-573411585-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-5851bd7c-d7ac-4a3b-9dba-f6a7406d8278'] [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 2244.876392] env[67964]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server raise self.value [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2244.877157] env[67964]: ERROR oslo_messaging.rpc.server raise self.value [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server raise self.value [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3328, in terminate_instance [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3323, in do_terminate_instance [ 2244.877849] env[67964]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server raise self.value [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server raise self.value [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 2244.878533] env[67964]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server return evt.wait() [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 2244.879155] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.879155] env[67964]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2244.879567] env[67964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2244.879950] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2244.879950] env[67964]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 2244.879950] env[67964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2244.879950] env[67964]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2244.879950] env[67964]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
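Editor's note: every traceback in this cascade bottoms out in the root cause flagged at the start: the Neutron client could not obtain an admin token with the credentials in the [neutron] section of nova.conf, so list_ports came back 401 and Nova translated it into NeutronAdminCredentialConfigurationInvalid. A quick way to test such service credentials outside Nova is to request a token directly with keystoneauth1 (the library the client stack actually authenticates through); all endpoint and account values below are placeholders, not taken from this deployment:

    from keystoneauth1.identity import v3
    from keystoneauth1 import session

    auth = v3.Password(
        auth_url="http://keystone.example.test/identity/v3",  # placeholder
        username="neutron-service-user",                      # placeholder
        password="secret",                                    # placeholder
        project_name="service",
        user_domain_name="Default",
        project_domain_name="Default",
    )
    sess = session.Session(auth=auth)

    # With bad credentials this raises keystoneauth1's Unauthorized error,
    # the same 401 the neutronclient traceback above surfaces.
    print(sess.get_token())

If the token request fails, the fix is in the [neutron] credentials themselves rather than in Nova's deallocation logic, which is only retrying a request that can never succeed.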
[ 2244.879950] env[67964]: ERROR oslo_messaging.rpc.server [ 2258.795691] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2291.963018] env[67964]: WARNING oslo_vmware.rw_handles [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2291.963018] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2291.963018] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2291.963018] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2291.963018] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2291.963018] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 2291.963018] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2291.963018] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2291.963018] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2291.963018] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2291.963018] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2291.963018] env[67964]: ERROR oslo_vmware.rw_handles [ 2291.963018] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/a2787e62-36c7-40ff-b24a-27486cac4b97/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2291.964867] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2291.965151] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Copying Virtual Disk [datastore1] vmware_temp/a2787e62-36c7-40ff-b24a-27486cac4b97/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/a2787e62-36c7-40ff-b24a-27486cac4b97/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2291.965499] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-559aa77e-0bd4-4e22-856f-8610774e23f3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
2291.973744] env[67964]: DEBUG oslo_vmware.api [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Waiting for the task: (returnval){ [ 2291.973744] env[67964]: value = "task-3456906" [ 2291.973744] env[67964]: _type = "Task" [ 2291.973744] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2291.982115] env[67964]: DEBUG oslo_vmware.api [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Task: {'id': task-3456906, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2292.484589] env[67964]: DEBUG oslo_vmware.exceptions [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2292.484969] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2292.485538] env[67964]: ERROR nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2292.485538] env[67964]: Faults: ['InvalidArgument'] [ 2292.485538] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Traceback (most recent call last): [ 2292.485538] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2292.485538] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] yield resources [ 2292.485538] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2292.485538] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] self.driver.spawn(context, instance, image_meta, [ 2292.485538] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2292.485538] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2292.485538] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2292.485538] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] self._fetch_image_if_missing(context, vi) [ 2292.485538] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2292.485538] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] image_cache(vi, tmp_image_ds_loc) [ 2292.485935] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2292.485935] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] vm_util.copy_virtual_disk( [ 2292.485935] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2292.485935] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] session._wait_for_task(vmdk_copy_task) [ 2292.485935] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2292.485935] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] return self.wait_for_task(task_ref) [ 2292.485935] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2292.485935] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] return evt.wait() [ 2292.485935] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2292.485935] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] result = hub.switch() [ 2292.485935] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2292.485935] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] return self.greenlet.switch() [ 2292.485935] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2292.486276] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] self.f(*self.args, **self.kw) [ 2292.486276] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2292.486276] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] raise exceptions.translate_fault(task_info.error) [ 2292.486276] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2292.486276] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Faults: ['InvalidArgument'] [ 2292.486276] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] [ 2292.486276] env[67964]: INFO nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] 
Terminating instance [ 2292.487421] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2292.487657] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2292.487895] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d787bbf2-5171-481f-9ae8-45148ea28d4e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.490109] env[67964]: DEBUG nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2292.490306] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2292.491007] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c49e22a1-28cc-415e-a650-00d898803a17 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.497750] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2292.498059] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f3d67f8e-c340-4e72-8e34-8d29911fb1fc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.500121] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2292.500296] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2292.501262] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8c5a1cb8-bf41-444e-ab59-ce8092b2e1a6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.506370] env[67964]: DEBUG oslo_vmware.api [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Waiting for the task: (returnval){ [ 2292.506370] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5296e27a-bcd5-76e6-f651-de67cdb7ed91" [ 2292.506370] env[67964]: _type = "Task" [ 2292.506370] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2292.514386] env[67964]: DEBUG oslo_vmware.api [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]5296e27a-bcd5-76e6-f651-de67cdb7ed91, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2292.563299] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2292.563516] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2292.563695] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Deleting the datastore file [datastore1] 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2292.564012] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d805048f-c56c-4fc5-828d-4dba53134624 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.570188] env[67964]: DEBUG oslo_vmware.api [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Waiting for the task: (returnval){ [ 2292.570188] env[67964]: value = "task-3456908" [ 2292.570188] env[67964]: _type = "Task" [ 2292.570188] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2292.577857] env[67964]: DEBUG oslo_vmware.api [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Task: {'id': task-3456908, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2292.800364] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2292.812519] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2292.812747] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2292.812925] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2292.813098] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2292.814201] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e146afc4-7361-4a23-86e2-f7a847dd822a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.822370] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9a4da1c-fdba-4505-8ff8-1873c376c311 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.835785] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-749c4225-af54-45fa-9314-7efbc4ebcdf1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.841627] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc5ed095-a576-4420-adbe-ccd495df1575 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.871195] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180849MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2292.871314] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2292.871473] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2292.935873] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2292.936063] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 07489f39-f57c-4528-80b8-b42056181b8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2292.936197] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 3e0e0504-9c76-4201-baf8-2d9636981f0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2292.936319] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance aa9c54a7-7b81-45cb-9f53-2016f4ea4b72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2292.936438] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2a0e1c08-8201-4ed7-9072-fdd90f25f120 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2292.936555] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2292.936668] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 1d709fb2-2bfe-463c-b39c-06e4a31cb0de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2292.936857] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2292.936993] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2293.017993] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2293.017993] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Creating directory with path [datastore1] vmware_temp/50e72259-b0e5-4fc7-ae46-ea383e8b4335/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2293.019176] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d1dbd3ae-c179-4107-ad22-fa4d2802349c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.026658] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a64c9402-fcbd-48e4-9af2-877ad94692a4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.029922] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Created directory with path [datastore1] vmware_temp/50e72259-b0e5-4fc7-ae46-ea383e8b4335/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2293.030111] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Fetch image to [datastore1] vmware_temp/50e72259-b0e5-4fc7-ae46-ea383e8b4335/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2293.030280] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/50e72259-b0e5-4fc7-ae46-ea383e8b4335/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2293.031255] env[67964]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29e289d7-571e-4bde-a55f-a5538a5e8b84 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.035962] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-649ec256-d214-4d89-9923-2c01990e6517 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.041484] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99265898-65ea-481c-8643-89604fe24bb9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.068294] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9fefaaf-8870-4a69-b07a-ccf45149d1c4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.078802] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07a2b9ca-4a55-4226-b286-4fb1caa2df0e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.086885] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-702d3439-468b-44cc-ac8a-f7fd88ba0334 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.094645] env[67964]: DEBUG oslo_vmware.api [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Task: {'id': task-3456908, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067565} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2293.095514] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2293.095721] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2293.095942] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2293.096136] env[67964]: INFO nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Took 0.61 seconds to destroy the instance on the hypervisor. 
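
[annotation] The audit above reports the raw hypervisor view (48 vCPUs, 196590MB RAM, 200GB disk), while the placement inventory repeated throughout this log layers reservations and allocation ratios on top of it. As a rough illustration of that relationship, here is a standalone Python sketch that recomputes schedulable capacity from the inventory dict exactly as logged (trimmed to the fields used); capacity() is an illustrative helper, not a Nova or placement API, and the (total - reserved) * allocation_ratio formula is the standard placement capacity rule.

    # Sketch: how placement-style inventory turns into schedulable capacity.
    # The inventory values are copied from the log entries in this section;
    # capacity() is an illustrative helper, not a Nova/placement function.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    def capacity(rc: dict) -> float:
        # Placement computes usable capacity as (total - reserved) * allocation_ratio.
        return (rc['total'] - rc['reserved']) * rc['allocation_ratio']

    for name, rc in inventory.items():
        print(f"{name}: {capacity(rc):.0f} schedulable")
    # -> VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400
    # which is why the 7 allocated vCPUs reported above are nowhere near the limit.
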
[ 2293.127390] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2293.128845] env[67964]: DEBUG nova.compute.claims [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2293.129108] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2293.130082] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02a48a66-5819-4731-9b7e-5d2e55bdcb52 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.137310] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-176bfa5f-f2bc-4821-beb7-be7cfd0fce52 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.139504] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2293.153905] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2293.154112] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.283s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2293.154368] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.025s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2293.160781] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 
07489f39-f57c-4528-80b8-b42056181b8b] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2293.213651] env[67964]: DEBUG oslo_vmware.rw_handles [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/50e72259-b0e5-4fc7-ae46-ea383e8b4335/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2293.271799] env[67964]: DEBUG oslo_vmware.rw_handles [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2293.271994] env[67964]: DEBUG oslo_vmware.rw_handles [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/50e72259-b0e5-4fc7-ae46-ea383e8b4335/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2293.329696] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92c77c56-5d3c-4a6e-85d5-53992bed8a61 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.337289] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a0af6b7-410e-4337-a5bc-263ec0dd0177 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.366773] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88eab1d9-ef5b-4495-a571-0a69ef06f21f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.373741] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bd6dc91-cdef-4bdc-8364-60b4c1560091 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.386261] env[67964]: DEBUG nova.compute.provider_tree [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2293.394775] env[67964]: DEBUG nova.scheduler.client.report [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2293.408032] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.254s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2293.408545] env[67964]: ERROR nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2293.408545] env[67964]: Faults: ['InvalidArgument'] [ 2293.408545] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Traceback (most recent call last): [ 2293.408545] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2293.408545] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] self.driver.spawn(context, instance, image_meta, [ 2293.408545] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2293.408545] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2293.408545] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2293.408545] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] self._fetch_image_if_missing(context, vi) [ 2293.408545] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2293.408545] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] image_cache(vi, tmp_image_ds_loc) [ 2293.408545] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2293.408928] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] vm_util.copy_virtual_disk( [ 2293.408928] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2293.408928] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] session._wait_for_task(vmdk_copy_task) [ 2293.408928] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2293.408928] env[67964]: ERROR nova.compute.manager [instance: 
78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] return self.wait_for_task(task_ref) [ 2293.408928] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2293.408928] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] return evt.wait() [ 2293.408928] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2293.408928] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] result = hub.switch() [ 2293.408928] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2293.408928] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] return self.greenlet.switch() [ 2293.408928] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2293.408928] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] self.f(*self.args, **self.kw) [ 2293.409256] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2293.409256] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] raise exceptions.translate_fault(task_info.error) [ 2293.409256] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2293.409256] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Faults: ['InvalidArgument'] [ 2293.409256] env[67964]: ERROR nova.compute.manager [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] [ 2293.409256] env[67964]: DEBUG nova.compute.utils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2293.410598] env[67964]: DEBUG nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Build of instance 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a was re-scheduled: A specified parameter was not correct: fileType [ 2293.410598] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2293.410962] env[67964]: DEBUG nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2293.411152] env[67964]: DEBUG nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 
tempest-ServersTestJSON-385895298-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2293.411323] env[67964]: DEBUG nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2293.411480] env[67964]: DEBUG nova.network.neutron [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2293.730725] env[67964]: DEBUG nova.network.neutron [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2293.741628] env[67964]: INFO nova.compute.manager [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Took 0.33 seconds to deallocate network for instance. [ 2293.837015] env[67964]: INFO nova.scheduler.client.report [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Deleted allocations for instance 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a [ 2293.861375] env[67964]: DEBUG oslo_concurrency.lockutils [None req-7064a51a-ca55-481e-9177-d78f6c17a85f tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "78e4a99a-35a1-4ad8-91f0-97f0e2a1641a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 645.260s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2293.861647] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e5e5a7ff-dc2d-4fd0-90c6-eb6216f3bb71 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "78e4a99a-35a1-4ad8-91f0-97f0e2a1641a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 449.269s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2293.861877] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e5e5a7ff-dc2d-4fd0-90c6-eb6216f3bb71 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquiring lock "78e4a99a-35a1-4ad8-91f0-97f0e2a1641a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2293.862096] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e5e5a7ff-dc2d-4fd0-90c6-eb6216f3bb71 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "78e4a99a-35a1-4ad8-91f0-97f0e2a1641a-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2293.862281] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e5e5a7ff-dc2d-4fd0-90c6-eb6216f3bb71 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "78e4a99a-35a1-4ad8-91f0-97f0e2a1641a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2293.864225] env[67964]: INFO nova.compute.manager [None req-e5e5a7ff-dc2d-4fd0-90c6-eb6216f3bb71 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Terminating instance [ 2293.867076] env[67964]: DEBUG nova.compute.manager [None req-e5e5a7ff-dc2d-4fd0-90c6-eb6216f3bb71 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2293.867274] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e5e5a7ff-dc2d-4fd0-90c6-eb6216f3bb71 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2293.867533] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e3996b6d-3506-4fa6-bb89-53e41c28c3be {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.876287] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1763aff6-a93f-46e7-a176-b7566df03c0c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2293.903884] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-e5e5a7ff-dc2d-4fd0-90c6-eb6216f3bb71 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a could not be found. [ 2293.904066] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e5e5a7ff-dc2d-4fd0-90c6-eb6216f3bb71 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2293.904246] env[67964]: INFO nova.compute.manager [None req-e5e5a7ff-dc2d-4fd0-90c6-eb6216f3bb71 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2293.904480] env[67964]: DEBUG oslo.service.loopingcall [None req-e5e5a7ff-dc2d-4fd0-90c6-eb6216f3bb71 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2293.904700] env[67964]: DEBUG nova.compute.manager [-] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2293.904799] env[67964]: DEBUG nova.network.neutron [-] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2293.927911] env[67964]: DEBUG nova.network.neutron [-] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2293.935692] env[67964]: INFO nova.compute.manager [-] [instance: 78e4a99a-35a1-4ad8-91f0-97f0e2a1641a] Took 0.03 seconds to deallocate network for instance. [ 2294.027829] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e5e5a7ff-dc2d-4fd0-90c6-eb6216f3bb71 tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Lock "78e4a99a-35a1-4ad8-91f0-97f0e2a1641a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.166s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2299.158307] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2300.796833] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2300.799513] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2301.801333] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2301.801674] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2303.800155] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2303.800448] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 2304.800988] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2304.801366] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 2304.801366] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 2304.817407] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2304.817570] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2304.817648] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2304.817798] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2304.817939] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2304.818083] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2304.818206] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 2304.818666] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2340.750466] env[67964]: WARNING oslo_vmware.rw_handles [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2340.750466] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2340.750466] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2340.750466] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2340.750466] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2340.750466] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 2340.750466] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2340.750466] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2340.750466] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2340.750466] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2340.750466] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2340.750466] env[67964]: ERROR oslo_vmware.rw_handles [ 2340.751190] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/50e72259-b0e5-4fc7-ae46-ea383e8b4335/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2340.753028] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2340.753282] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Copying Virtual Disk [datastore1] vmware_temp/50e72259-b0e5-4fc7-ae46-ea383e8b4335/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/50e72259-b0e5-4fc7-ae46-ea383e8b4335/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2340.753581] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b1310e1f-41fa-4964-8f6c-7b25a3ecf82e {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.761237] env[67964]: DEBUG oslo_vmware.api [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Waiting for the task: (returnval){ [ 2340.761237] env[67964]: value = "task-3456909" [ 2340.761237] env[67964]: _type = "Task" [ 2340.761237] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2340.768982] env[67964]: DEBUG oslo_vmware.api [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Task: {'id': task-3456909, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2341.271389] env[67964]: DEBUG oslo_vmware.exceptions [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2341.271666] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2341.272222] env[67964]: ERROR nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2341.272222] env[67964]: Faults: ['InvalidArgument'] [ 2341.272222] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Traceback (most recent call last): [ 2341.272222] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2341.272222] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] yield resources [ 2341.272222] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2341.272222] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] self.driver.spawn(context, instance, image_meta, [ 2341.272222] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2341.272222] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2341.272222] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2341.272222] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] 
self._fetch_image_if_missing(context, vi) [ 2341.272222] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2341.272578] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] image_cache(vi, tmp_image_ds_loc) [ 2341.272578] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2341.272578] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] vm_util.copy_virtual_disk( [ 2341.272578] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2341.272578] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] session._wait_for_task(vmdk_copy_task) [ 2341.272578] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2341.272578] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] return self.wait_for_task(task_ref) [ 2341.272578] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2341.272578] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] return evt.wait() [ 2341.272578] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2341.272578] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] result = hub.switch() [ 2341.272578] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2341.272578] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] return self.greenlet.switch() [ 2341.272887] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2341.272887] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] self.f(*self.args, **self.kw) [ 2341.272887] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2341.272887] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] raise exceptions.translate_fault(task_info.error) [ 2341.272887] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2341.272887] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Faults: ['InvalidArgument'] [ 2341.272887] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] [ 2341.272887] env[67964]: INFO nova.compute.manager [None 
req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Terminating instance [ 2341.274014] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2341.274224] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2341.274464] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b736b346-06e5-49c5-b835-a0f98bdef657 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.276762] env[67964]: DEBUG nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2341.276952] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2341.277663] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ead4a8e-6041-415a-b916-3964e203072b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.284310] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2341.284532] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f2d43d73-3215-42e1-8dfb-82622a3aaecc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.286631] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2341.286796] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2341.287743] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c938df08-46f7-43d6-a932-d122b6845b6e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.292484] env[67964]: DEBUG oslo_vmware.api [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for the task: (returnval){ [ 2341.292484] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52f28cd9-51c4-0f02-f801-20dcfd897f34" [ 2341.292484] env[67964]: _type = "Task" [ 2341.292484] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2341.299019] env[67964]: DEBUG oslo_vmware.api [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52f28cd9-51c4-0f02-f801-20dcfd897f34, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2341.350165] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2341.350362] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2341.350534] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Deleting the datastore file [datastore1] 07489f39-f57c-4528-80b8-b42056181b8b {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2341.350780] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f233e8d6-15cf-4a92-89fa-7431b31388ba {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.356786] env[67964]: DEBUG oslo_vmware.api [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Waiting for the task: (returnval){ [ 2341.356786] env[67964]: value = "task-3456911" [ 2341.356786] env[67964]: _type = "Task" [ 2341.356786] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2341.364096] env[67964]: DEBUG oslo_vmware.api [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Task: {'id': task-3456911, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2341.803391] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2341.803732] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Creating directory with path [datastore1] vmware_temp/310469f6-8303-42bc-92f1-0b5b5df87ac3/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2341.803824] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-abcffc8f-6cfe-4ad9-8c47-34efe43518e1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.814985] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Created directory with path [datastore1] vmware_temp/310469f6-8303-42bc-92f1-0b5b5df87ac3/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2341.815178] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Fetch image to [datastore1] vmware_temp/310469f6-8303-42bc-92f1-0b5b5df87ac3/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2341.815353] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/310469f6-8303-42bc-92f1-0b5b5df87ac3/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2341.816073] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c33bdf91-153c-4f9f-a25e-3eb028f89dc2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.823477] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-301b7392-97e1-4d73-b684-7c7114df1f3e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.832060] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7065d88-154b-45c3-882b-3014bbbbf1ae {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.863611] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3766501b-2df0-45f6-8bcc-038fa182cdda {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.870133] env[67964]: DEBUG oslo_vmware.api [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Task: {'id': task-3456911, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.107624} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2341.871455] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2341.871643] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2341.871811] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2341.871981] env[67964]: INFO nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Took 0.60 seconds to destroy the instance on the hypervisor. 
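
[annotation] This is the second build in the section to fail the same way: the image is streamed to tmp-sparse.vmdk, VirtualDiskManager.CopyVirtualDisk_Task is started, and the oslo.vmware task poller surfaces the vCenter fault ("A specified parameter was not correct: fileType", faults ['InvalidArgument']) as a VimFaultException, which aborts spawn, releases the resource claim, and tears the instance down from the datastore. Below is a simplified, self-contained Python sketch of that poll-and-translate pattern; TaskInfo, poll_task, and the scripted states are stand-ins for illustration, not the actual oslo.vmware implementation.

    import time
    from dataclasses import dataclass

    @dataclass
    class TaskInfo:
        # Stand-in for a vSphere TaskInfo: a state plus an optional fault payload.
        state: str                 # 'running' | 'success' | 'error'
        error: dict | None = None

    class VimFaultException(Exception):
        def __init__(self, msg, faults):
            super().__init__(msg)
            self.faults = faults

    def poll_task(fetch_info, interval=0.5):
        # Loop until the server-side task finishes; on error, translate the
        # fault into an exception the caller (e.g. spawn) can catch, so the
        # build can be aborted and rescheduled as seen in this log.
        while True:
            info = fetch_info()
            if info.state == 'success':
                return info
            if info.state == 'error':
                raise VimFaultException(info.error['msg'], info.error['faults'])
            time.sleep(interval)

    # The failure seen above, condensed: the first poll reports 'running'
    # (progress is 0%), the next returns the InvalidArgument fault on fileType.
    states = iter([
        TaskInfo('running'),
        TaskInfo('error', {'msg': 'A specified parameter was not correct: fileType',
                           'faults': ['InvalidArgument']}),
    ])
    try:
        poll_task(lambda: next(states), interval=0)
    except VimFaultException as e:
        print(e, e.faults)   # -> message plus ['InvalidArgument'], as in the traceback
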
[ 2341.873683] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b954ca9b-e1ae-47fa-8452-abc158776bc3 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.875463] env[67964]: DEBUG nova.compute.claims [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2341.875632] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2341.875844] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2341.896459] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2341.949653] env[67964]: DEBUG oslo_vmware.rw_handles [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/310469f6-8303-42bc-92f1-0b5b5df87ac3/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2342.008976] env[67964]: DEBUG oslo_vmware.rw_handles [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2342.009177] env[67964]: DEBUG oslo_vmware.rw_handles [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/310469f6-8303-42bc-92f1-0b5b5df87ac3/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
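
[editor's note] The rw_handles lines above show how image bytes reach the datastore: a raw HTTP connection is opened to the host's /folder URL and the Glance image is streamed into tmp-sparse.vmdk chunk by chunk. A minimal standard-library sketch of that write-handle idea — illustrative only; the real handle also manages the service ticket acquired above, cookies, and the TLS context:

    import http.client

    def upload_to_datastore(host, path_and_query, data_iter, size):
        """Stream image chunks to an ESX /folder URL via HTTP PUT."""
        conn = http.client.HTTPSConnection(host, 443)
        conn.putrequest('PUT', path_and_query)
        conn.putheader('Content-Length', str(size))
        conn.endheaders()
        for chunk in data_iter:    # "reading data from the image iterator"
            conn.send(chunk)
        resp = conn.getresponse()  # closing the handle reads the response;
        conn.close()               # this is where RemoteDisconnected shows
        return resp.status         # up later in this log
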
{{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2342.051474] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7c3808e-f93c-4498-b687-b50cf95a00d0 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2342.058979] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-141fd5db-9dad-480e-8d23-188cfcafb060 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2342.087979] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22780643-55fb-45ae-897f-aec5091f7c98 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2342.094773] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-258efba2-d4c3-4c07-9966-dfd1aef0780b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2342.107348] env[67964]: DEBUG nova.compute.provider_tree [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2342.115340] env[67964]: DEBUG nova.scheduler.client.report [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2342.129485] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.254s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2342.129985] env[67964]: ERROR nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2342.129985] env[67964]: Faults: ['InvalidArgument'] [ 2342.129985] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Traceback (most recent call last): [ 2342.129985] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2342.129985] env[67964]: ERROR 
nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] self.driver.spawn(context, instance, image_meta, [ 2342.129985] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2342.129985] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2342.129985] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2342.129985] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] self._fetch_image_if_missing(context, vi) [ 2342.129985] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2342.129985] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] image_cache(vi, tmp_image_ds_loc) [ 2342.129985] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2342.130296] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] vm_util.copy_virtual_disk( [ 2342.130296] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2342.130296] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] session._wait_for_task(vmdk_copy_task) [ 2342.130296] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2342.130296] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] return self.wait_for_task(task_ref) [ 2342.130296] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2342.130296] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] return evt.wait() [ 2342.130296] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2342.130296] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] result = hub.switch() [ 2342.130296] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2342.130296] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] return self.greenlet.switch() [ 2342.130296] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2342.130296] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] self.f(*self.args, **self.kw) [ 2342.130589] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2342.130589] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] raise exceptions.translate_fault(task_info.error) [ 2342.130589] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2342.130589] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Faults: ['InvalidArgument'] [ 2342.130589] env[67964]: ERROR nova.compute.manager [instance: 07489f39-f57c-4528-80b8-b42056181b8b] [ 2342.130700] env[67964]: DEBUG nova.compute.utils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2342.131962] env[67964]: DEBUG nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Build of instance 07489f39-f57c-4528-80b8-b42056181b8b was re-scheduled: A specified parameter was not correct: fileType [ 2342.131962] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2342.132354] env[67964]: DEBUG nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2342.132526] env[67964]: DEBUG nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2342.132692] env[67964]: DEBUG nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2342.132847] env[67964]: DEBUG nova.network.neutron [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2342.388843] env[67964]: DEBUG nova.network.neutron [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2342.402334] env[67964]: INFO nova.compute.manager [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Took 0.27 seconds to deallocate network for instance. [ 2342.488682] env[67964]: INFO nova.scheduler.client.report [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Deleted allocations for instance 07489f39-f57c-4528-80b8-b42056181b8b [ 2342.514274] env[67964]: DEBUG oslo_concurrency.lockutils [None req-83d07e82-b377-4051-b09f-7a31ce3e769c tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "07489f39-f57c-4528-80b8-b42056181b8b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 550.750s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2342.514444] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8faea3a4-d2ee-4dd8-ba5c-54011e3622fc tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "07489f39-f57c-4528-80b8-b42056181b8b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 354.965s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2342.514580] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8faea3a4-d2ee-4dd8-ba5c-54011e3622fc tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "07489f39-f57c-4528-80b8-b42056181b8b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2342.514775] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8faea3a4-d2ee-4dd8-ba5c-54011e3622fc tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "07489f39-f57c-4528-80b8-b42056181b8b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
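
[editor's note] The "Virt driver does not provide unplug_vifs method" message in the reschedule path above reflects how cleanup probes optional driver capabilities: the manager calls the hook and treats NotImplementedError as "capability absent". A simplified sketch of that probe (hypothetical class and function names; the log message is quoted verbatim, including its original wording):

    import logging

    LOG = logging.getLogger(__name__)

    class BaseDriver:
        def unplug_vifs(self, instance, network_info):
            # Optional hook: drivers that cannot unplug individual VIFs
            # (like the VMware driver in this log) do not override it.
            raise NotImplementedError()

    def cleanup_networks(driver, instance, network_info):
        try:
            driver.unplug_vifs(instance, network_info)
        except NotImplementedError:
            LOG.debug('Virt driver does not provide unplug_vifs method, '
                      'so it is not possible determine if VIFs should '
                      'be unplugged.')
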
{{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2342.514941] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8faea3a4-d2ee-4dd8-ba5c-54011e3622fc tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "07489f39-f57c-4528-80b8-b42056181b8b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2342.516866] env[67964]: INFO nova.compute.manager [None req-8faea3a4-d2ee-4dd8-ba5c-54011e3622fc tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Terminating instance [ 2342.518581] env[67964]: DEBUG nova.compute.manager [None req-8faea3a4-d2ee-4dd8-ba5c-54011e3622fc tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2342.518775] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8faea3a4-d2ee-4dd8-ba5c-54011e3622fc tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2342.519533] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-843b5f3e-315c-49d1-b5b3-22819b754f61 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2342.528103] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-315bd625-b1b0-443a-8b64-e6b45548a69c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2342.554356] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-8faea3a4-d2ee-4dd8-ba5c-54011e3622fc tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 07489f39-f57c-4528-80b8-b42056181b8b could not be found. [ 2342.554590] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-8faea3a4-d2ee-4dd8-ba5c-54011e3622fc tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2342.554770] env[67964]: INFO nova.compute.manager [None req-8faea3a4-d2ee-4dd8-ba5c-54011e3622fc tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2342.555021] env[67964]: DEBUG oslo.service.loopingcall [None req-8faea3a4-d2ee-4dd8-ba5c-54011e3622fc tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
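
[editor's note] "Waiting for function ... _deallocate_network_with_retries to return." above is oslo.service's looping-call machinery: the wrapped function runs on an interval until it raises LoopingCallDone, whose payload becomes the result of .wait(). A self-contained use of that API — the deallocation body is a placeholder, not Nova's actual retry logic:

    from oslo_service import loopingcall

    def deallocate_network():
        """Hypothetical stand-in for the real neutron deallocation call."""
        return True

    attempts = {'n': 0}

    def _deallocate_network_with_retries():
        attempts['n'] += 1
        if not deallocate_network() and attempts['n'] < 3:
            return  # run again on the next interval tick
        # Done (success or retries exhausted): stop the loop; the value
        # passed here becomes the return value of .wait() below.
        raise loopingcall.LoopingCallDone(attempts['n'])

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate_network_with_retries)
    result = timer.start(interval=1.0).wait()  # blocks until LoopingCallDone
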
{{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2342.555526] env[67964]: DEBUG nova.compute.manager [-] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2342.555633] env[67964]: DEBUG nova.network.neutron [-] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2342.580592] env[67964]: DEBUG nova.network.neutron [-] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2342.588359] env[67964]: INFO nova.compute.manager [-] [instance: 07489f39-f57c-4528-80b8-b42056181b8b] Took 0.03 seconds to deallocate network for instance. [ 2342.687011] env[67964]: DEBUG oslo_concurrency.lockutils [None req-8faea3a4-d2ee-4dd8-ba5c-54011e3622fc tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Lock "07489f39-f57c-4528-80b8-b42056181b8b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.173s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2353.801052] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2353.812647] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2353.812864] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2353.813037] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2353.813196] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2353.814322] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cdf4357-cae4-485f-baec-14fb73af2065 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.822645] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01f3f537-3215-46b7-ae0e-adc1e9977602 {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.836168] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c17ed30a-a69a-4d6e-86bc-3080fe9e3a78 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.842246] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce3e2ba1-ee4e-470a-92a6-e1953e91c57f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.870403] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180881MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2353.870861] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2353.870861] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2353.923525] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 3e0e0504-9c76-4201-baf8-2d9636981f0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2353.923678] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance aa9c54a7-7b81-45cb-9f53-2016f4ea4b72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2353.923804] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2a0e1c08-8201-4ed7-9072-fdd90f25f120 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2353.923923] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2353.924052] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 1d709fb2-2bfe-463c-b39c-06e4a31cb0de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2353.924229] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2353.924367] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2353.991445] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57313c6e-fce6-4095-8aa7-7b9c4fd343a6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.998578] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f416e2a-8cf3-4936-b52b-a3d30ab8cefd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.029669] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-559bcf48-f40b-4a08-b4ac-a1b7014d0392 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.036634] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53751f6d-1cb6-435b-86c8-45385491b047 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.049333] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2354.057205] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2354.069644] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2354.069825] 
env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2359.694501] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2359.694957] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Getting list of instances from cluster (obj){ [ 2359.694957] env[67964]: value = "domain-c8" [ 2359.694957] env[67964]: _type = "ClusterComputeResource" [ 2359.694957] env[67964]: } {{(pid=67964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2359.696032] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b29245a5-7184-49ba-8270-af1946297451 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2359.709009] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Got total of 5 instances {{(pid=67964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2359.831442] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2361.800434] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2362.795855] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2362.799503] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2362.799729] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2363.807747] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2363.808100] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67964) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2363.808100] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Cleaning up deleted instances {{(pid=67964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11199}} [ 2363.819411] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] There are 0 instances to clean {{(pid=67964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11208}} [ 2364.811470] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2364.811753] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2364.811837] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 2366.801478] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2366.801863] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 2366.801863] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 2366.823260] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2366.823434] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2366.823540] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2366.823667] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Skipping network cache update for instance because it is Building. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2366.823800] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2366.823950] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 2377.801949] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2377.801949] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Cleaning up deleted instances with incomplete migration {{(pid=67964) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11237}} [ 2379.804629] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2380.347953] env[67964]: DEBUG oslo_concurrency.lockutils [None req-cb3d2f9f-9722-4ec9-a0d4-e037b66e4290 tempest-DeleteServersTestJSON-2048211470 tempest-DeleteServersTestJSON-2048211470-project-member] Acquiring lock "1327b0f7-bc48-4475-8e12-7dcb7bcf28b5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2389.672162] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2389.686596] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Getting list of instances from cluster (obj){ [ 2389.686596] env[67964]: value = "domain-c8" [ 2389.686596] env[67964]: _type = "ClusterComputeResource" [ 2389.686596] env[67964]: } {{(pid=67964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2389.687867] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01534f88-f203-40bd-bd5c-e38a3c171c6a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2389.701266] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Got total of 5 instances {{(pid=67964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2389.701419] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 3e0e0504-9c76-4201-baf8-2d9636981f0c {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 2389.701607] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 
aa9c54a7-7b81-45cb-9f53-2016f4ea4b72 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 2389.701765] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 2a0e1c08-8201-4ed7-9072-fdd90f25f120 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 2389.701919] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5 {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 2389.702090] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Triggering sync for uuid 1d709fb2-2bfe-463c-b39c-06e4a31cb0de {{(pid=67964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}} [ 2389.702385] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "3e0e0504-9c76-4201-baf8-2d9636981f0c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2389.702605] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2389.702802] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "2a0e1c08-8201-4ed7-9072-fdd90f25f120" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2389.702996] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "1327b0f7-bc48-4475-8e12-7dcb7bcf28b5" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2389.703207] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "1d709fb2-2bfe-463c-b39c-06e4a31cb0de" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2390.765931] env[67964]: WARNING oslo_vmware.rw_handles [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2390.765931] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2390.765931] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2390.765931] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2390.765931] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 
1375, in getresponse [ 2390.765931] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 2390.765931] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2390.765931] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2390.765931] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2390.765931] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2390.765931] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2390.765931] env[67964]: ERROR oslo_vmware.rw_handles [ 2390.766512] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/310469f6-8303-42bc-92f1-0b5b5df87ac3/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2390.768443] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2390.768720] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Copying Virtual Disk [datastore1] vmware_temp/310469f6-8303-42bc-92f1-0b5b5df87ac3/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/310469f6-8303-42bc-92f1-0b5b5df87ac3/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2390.769016] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1afe71e3-01f8-4fac-9e4c-8bc49bd5f8f4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.775840] env[67964]: DEBUG oslo_vmware.api [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for the task: (returnval){ [ 2390.775840] env[67964]: value = "task-3456912" [ 2390.775840] env[67964]: _type = "Task" [ 2390.775840] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2390.783301] env[67964]: DEBUG oslo_vmware.api [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': task-3456912, 'name': CopyVirtualDisk_Task} progress is 0%. 
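
[editor's note] The VirtualDiskManager.CopyVirtualDisk_Task invocation above is the caching step: the downloaded tmp-sparse.vmdk is copied to its flat cache name, and it is this copy that keeps failing with "InvalidArgument: fileType" in this log. Against oslo.vmware's invoke_api helper the invoke-and-wait shape is roughly as follows — a sketch assuming an established VMwareAPISession `session`; argument handling is simplified:

    def copy_virtual_disk(session, dc_ref, source, dest):
        """Copy [datastore] source -> dest and block until done."""
        disk_mgr = session.vim.service_content.virtualDiskManager
        task = session.invoke_api(session.vim, 'CopyVirtualDisk_Task',
                                  disk_mgr,
                                  sourceName=source, sourceDatacenter=dc_ref,
                                  destName=dest, destDatacenter=dc_ref)
        # Raises VimFaultException (e.g. Faults: ['InvalidArgument'])
        # if the task ends in error, exactly as seen in this log.
        session.wait_for_task(task)
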
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2391.286298] env[67964]: DEBUG oslo_vmware.exceptions [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2391.286586] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2391.287148] env[67964]: ERROR nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2391.287148] env[67964]: Faults: ['InvalidArgument'] [ 2391.287148] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Traceback (most recent call last): [ 2391.287148] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2391.287148] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] yield resources [ 2391.287148] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2391.287148] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] self.driver.spawn(context, instance, image_meta, [ 2391.287148] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2391.287148] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2391.287148] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2391.287148] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] self._fetch_image_if_missing(context, vi) [ 2391.287148] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2391.287565] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] image_cache(vi, tmp_image_ds_loc) [ 2391.287565] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2391.287565] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] vm_util.copy_virtual_disk( [ 2391.287565] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2391.287565] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] session._wait_for_task(vmdk_copy_task) [ 2391.287565] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2391.287565] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] return self.wait_for_task(task_ref) [ 2391.287565] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2391.287565] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] return evt.wait() [ 2391.287565] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2391.287565] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] result = hub.switch() [ 2391.287565] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2391.287565] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] return self.greenlet.switch() [ 2391.287877] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2391.287877] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] self.f(*self.args, **self.kw) [ 2391.287877] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2391.287877] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] raise exceptions.translate_fault(task_info.error) [ 2391.287877] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2391.287877] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Faults: ['InvalidArgument'] [ 2391.287877] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] [ 2391.287877] env[67964]: INFO nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Terminating instance [ 2391.289726] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2391.289726] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] 
Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2391.289726] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a8a12536-a9ff-4a17-9e5d-f4420ccba317 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2391.291785] env[67964]: DEBUG nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2391.291974] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2391.292685] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e5abb74-608c-4cc2-b258-d04309e85f21 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2391.299107] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2391.299327] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f9c3403e-599e-4284-892a-678fc4f31194 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2391.301376] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2391.301545] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2391.302454] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f7f7131f-594f-4478-9f04-3d7b70295318 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2391.308098] env[67964]: DEBUG oslo_vmware.api [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for the task: (returnval){ [ 2391.308098] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52a6662c-b625-e7c2-a63d-78d69cd1ef55" [ 2391.308098] env[67964]: _type = "Task" [ 2391.308098] env[67964]: } to complete. 
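
[editor's note] The "Creating directory ... devstack-image-cache_base" / "Folder ... created." pair above is the create-if-missing idiom for the image cache folder: MakeDirectory is issued unconditionally and an already-exists fault is swallowed, so concurrent builds can race safely. A sketch of that idiom under the same session assumption as above:

    from oslo_vmware import exceptions as vexc

    def create_folder_if_missing(session, dc_ref, ds_folder_path):
        """Idempotently create e.g. '[datastore1] devstack-image-cache_base'."""
        file_mgr = session.vim.service_content.fileManager
        try:
            session.invoke_api(session.vim, 'MakeDirectory', file_mgr,
                               name=ds_folder_path, datacenter=dc_ref,
                               createParentDirectories=True)
        except vexc.FileAlreadyExistsException:
            pass  # another request created it first; that is fine
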
{{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2391.314914] env[67964]: DEBUG oslo_vmware.api [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52a6662c-b625-e7c2-a63d-78d69cd1ef55, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2391.366621] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2391.366850] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2391.367061] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Deleting the datastore file [datastore1] 3e0e0504-9c76-4201-baf8-2d9636981f0c {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2391.367344] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b45ec32f-0cb9-4261-8cd9-15323c5b62bc {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2391.373867] env[67964]: DEBUG oslo_vmware.api [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for the task: (returnval){ [ 2391.373867] env[67964]: value = "task-3456914" [ 2391.373867] env[67964]: _type = "Task" [ 2391.373867] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2391.381162] env[67964]: DEBUG oslo_vmware.api [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': task-3456914, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2391.818571] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2391.818899] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Creating directory with path [datastore1] vmware_temp/9286d415-f4ec-490d-b1c4-de34725059a9/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2391.819097] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bcad346c-b48d-43ee-8391-8e4b693f0afd {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2391.829840] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Created directory with path [datastore1] vmware_temp/9286d415-f4ec-490d-b1c4-de34725059a9/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2391.830045] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Fetch image to [datastore1] vmware_temp/9286d415-f4ec-490d-b1c4-de34725059a9/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2391.830237] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/9286d415-f4ec-490d-b1c4-de34725059a9/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2391.830935] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8adb1f92-d735-48b5-aecf-286fa3ec5e02 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2391.837232] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ecabdee-b6f1-4dfd-93d4-725ee7717b56 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2391.845703] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efb6918a-1ac7-4cb2-9871-4b9fc2e1f87f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2391.877559] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6230e33c-5022-4cdb-b2c8-8a02ecb8fcc9 {{(pid=67964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2391.884091] env[67964]: DEBUG oslo_vmware.api [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Task: {'id': task-3456914, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072573} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2391.885470] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2391.885663] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2391.885833] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2391.885999] env[67964]: INFO nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 2391.887725] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e8e2ad42-f839-4fa1-8ffc-eddb95fb9e17 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2391.889563] env[67964]: DEBUG nova.compute.claims [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2391.889729] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2391.889972] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2391.909625] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2392.003352] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9a3d7ca-6a2e-4750-b009-93782ec230ce {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2392.012190] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0ada075-ab03-454d-a687-dd03e17fe384 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2392.045481] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9a1039b-b80d-4eda-84a5-29d1e9c91a0d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2392.052348] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aecca551-9fc4-48c5-b070-f3ca4950cca8 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2392.065278] env[67964]: DEBUG nova.compute.provider_tree [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2392.073412] env[67964]: DEBUG nova.scheduler.client.report [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 
tempest-ServerDiskConfigTestJSON-988333261-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2392.081554] env[67964]: DEBUG oslo_vmware.rw_handles [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9286d415-f4ec-490d-b1c4-de34725059a9/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2392.137455] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.247s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2392.137969] env[67964]: ERROR nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2392.137969] env[67964]: Faults: ['InvalidArgument'] [ 2392.137969] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Traceback (most recent call last): [ 2392.137969] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2392.137969] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] self.driver.spawn(context, instance, image_meta, [ 2392.137969] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2392.137969] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2392.137969] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2392.137969] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] self._fetch_image_if_missing(context, vi) [ 2392.137969] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2392.137969] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] image_cache(vi, tmp_image_ds_loc) [ 2392.137969] env[67964]: 
ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2392.138276] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] vm_util.copy_virtual_disk( [ 2392.138276] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2392.138276] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] session._wait_for_task(vmdk_copy_task) [ 2392.138276] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2392.138276] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] return self.wait_for_task(task_ref) [ 2392.138276] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2392.138276] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] return evt.wait() [ 2392.138276] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2392.138276] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] result = hub.switch() [ 2392.138276] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2392.138276] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] return self.greenlet.switch() [ 2392.138276] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2392.138276] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] self.f(*self.args, **self.kw) [ 2392.138610] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2392.138610] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] raise exceptions.translate_fault(task_info.error) [ 2392.138610] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2392.138610] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Faults: ['InvalidArgument'] [ 2392.138610] env[67964]: ERROR nova.compute.manager [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] [ 2392.138736] env[67964]: DEBUG nova.compute.utils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2392.141192] env[67964]: DEBUG nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 
tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Build of instance 3e0e0504-9c76-4201-baf8-2d9636981f0c was re-scheduled: A specified parameter was not correct: fileType [ 2392.141192] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2392.141601] env[67964]: DEBUG nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2392.141773] env[67964]: DEBUG nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2392.141939] env[67964]: DEBUG nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2392.142113] env[67964]: DEBUG nova.network.neutron [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2392.144301] env[67964]: DEBUG oslo_vmware.rw_handles [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2392.144491] env[67964]: DEBUG oslo_vmware.rw_handles [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9286d415-f4ec-490d-b1c4-de34725059a9/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2392.447734] env[67964]: DEBUG nova.network.neutron [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2392.463668] env[67964]: INFO nova.compute.manager [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Took 0.32 seconds to deallocate network for instance. 
[ 2392.560462] env[67964]: INFO nova.scheduler.client.report [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Deleted allocations for instance 3e0e0504-9c76-4201-baf8-2d9636981f0c [ 2392.581556] env[67964]: DEBUG oslo_concurrency.lockutils [None req-1b81847c-2cba-4737-8888-eb64404b4db8 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "3e0e0504-9c76-4201-baf8-2d9636981f0c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 597.998s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2392.582320] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a61a45b4-07cf-4ffe-a3e1-424865688cd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "3e0e0504-9c76-4201-baf8-2d9636981f0c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 402.445s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2392.582320] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a61a45b4-07cf-4ffe-a3e1-424865688cd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Acquiring lock "3e0e0504-9c76-4201-baf8-2d9636981f0c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2392.582320] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a61a45b4-07cf-4ffe-a3e1-424865688cd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "3e0e0504-9c76-4201-baf8-2d9636981f0c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2392.582507] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a61a45b4-07cf-4ffe-a3e1-424865688cd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "3e0e0504-9c76-4201-baf8-2d9636981f0c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2392.584272] env[67964]: INFO nova.compute.manager [None req-a61a45b4-07cf-4ffe-a3e1-424865688cd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Terminating instance [ 2392.585937] env[67964]: DEBUG nova.compute.manager [None req-a61a45b4-07cf-4ffe-a3e1-424865688cd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Start destroying the instance on the hypervisor. 
{{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2392.586070] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a61a45b4-07cf-4ffe-a3e1-424865688cd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2392.586656] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6e078d32-9a0e-42de-ae26-daaa0621418c {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2392.595189] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f98d3b3-bd18-4ad1-8533-aa0de525222d {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2392.621251] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-a61a45b4-07cf-4ffe-a3e1-424865688cd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3e0e0504-9c76-4201-baf8-2d9636981f0c could not be found. [ 2392.621472] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-a61a45b4-07cf-4ffe-a3e1-424865688cd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2392.621649] env[67964]: INFO nova.compute.manager [None req-a61a45b4-07cf-4ffe-a3e1-424865688cd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2392.621882] env[67964]: DEBUG oslo.service.loopingcall [None req-a61a45b4-07cf-4ffe-a3e1-424865688cd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2392.622115] env[67964]: DEBUG nova.compute.manager [-] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2392.622207] env[67964]: DEBUG nova.network.neutron [-] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2392.645788] env[67964]: DEBUG nova.network.neutron [-] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2392.653793] env[67964]: INFO nova.compute.manager [-] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] Took 0.03 seconds to deallocate network for instance. 
[ 2392.739659] env[67964]: DEBUG oslo_concurrency.lockutils [None req-a61a45b4-07cf-4ffe-a3e1-424865688cd1 tempest-ServerDiskConfigTestJSON-988333261 tempest-ServerDiskConfigTestJSON-988333261-project-member] Lock "3e0e0504-9c76-4201-baf8-2d9636981f0c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.158s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2392.740477] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "3e0e0504-9c76-4201-baf8-2d9636981f0c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 3.038s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2392.740665] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 3e0e0504-9c76-4201-baf8-2d9636981f0c] During sync_power_state the instance has a pending task (deleting). Skip. [ 2392.740836] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "3e0e0504-9c76-4201-baf8-2d9636981f0c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2403.091858] env[67964]: DEBUG oslo_concurrency.lockutils [None req-de9e7aae-9263-4f3a-a852-3f947c2c2756 tempest-AttachVolumeTestJSON-932424003 tempest-AttachVolumeTestJSON-932424003-project-member] Acquiring lock "1d709fb2-2bfe-463c-b39c-06e4a31cb0de" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2414.801057] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2414.812435] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2414.812657] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2414.812820] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2414.812974] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67964) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2414.814112] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01394077-09c6-48a8-891c-fc3620491428 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.822089] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-809e661e-f853-4bff-8a42-e790efc644e4 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.836260] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15e8d663-b5f4-4feb-a344-6ce03c328da2 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.842246] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c28e289-d556-43dd-82b4-d755cd1a6fb6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.871604] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180934MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2414.871746] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2414.871930] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2414.997984] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance aa9c54a7-7b81-45cb-9f53-2016f4ea4b72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2414.997984] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 2a0e1c08-8201-4ed7-9072-fdd90f25f120 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2414.997984] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2414.997984] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Instance 1d709fb2-2bfe-463c-b39c-06e4a31cb0de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2414.998247] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2414.998247] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=67964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2415.013686] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing inventories for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:818}} [ 2415.027010] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Updating ProviderTree inventory for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:782}} [ 2415.027232] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Updating inventory in ProviderTree for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2415.037498] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing aggregate associations for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41, aggregates: None {{(pid=67964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:827}} [ 2415.054559] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Refreshing trait associations for resource provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=67964) _refresh_associations 
/opt/stack/nova/nova/scheduler/client/report.py:839}} [ 2415.107983] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a4831b0-47ae-468d-ac72-7830a8289929 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2415.115716] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-650ffd32-19a5-4d58-9d94-aad9d6750ffe {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2415.144321] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36627cef-4bd0-4419-bc83-444e1876fae1 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2415.151286] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95cd3285-92a5-4de4-ad24-404d040cdb50 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2415.164802] env[67964]: DEBUG nova.compute.provider_tree [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2415.172876] env[67964]: DEBUG nova.scheduler.client.report [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}} [ 2415.186875] env[67964]: DEBUG nova.compute.resource_tracker [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2415.187072] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.315s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2420.187073] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2421.800507] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2422.796199] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._check_instance_build_time 
{{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2422.799799] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2423.800670] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2425.801641] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2426.801304] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2426.801603] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Starting heal instance info cache {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 2426.801852] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Rebuilding the list of instances to heal {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 2426.820169] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2426.820543] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2426.820543] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 1327b0f7-bc48-4475-8e12-7dcb7bcf28b5] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2426.820543] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: 1d709fb2-2bfe-463c-b39c-06e4a31cb0de] Skipping network cache update for instance because it is Building. {{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2426.820711] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Didn't find any instances for network info cache update. 
{{(pid=67964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 2426.821182] env[67964]: DEBUG oslo_service.periodic_task [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2426.821340] env[67964]: DEBUG nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 2440.169284] env[67964]: WARNING oslo_vmware.rw_handles [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2440.169284] env[67964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2440.169284] env[67964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2440.169284] env[67964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2440.169284] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2440.169284] env[67964]: ERROR oslo_vmware.rw_handles response.begin() [ 2440.169284] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2440.169284] env[67964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2440.169284] env[67964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2440.169284] env[67964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2440.169284] env[67964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2440.169284] env[67964]: ERROR oslo_vmware.rw_handles [ 2440.170203] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Downloaded image file data b261268a-9800-40a9-afde-85d61f8eed6a to vmware_temp/9286d415-f4ec-490d-b1c4-de34725059a9/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2440.171623] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Caching image {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2440.171863] env[67964]: DEBUG nova.virt.vmwareapi.vm_util [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Copying Virtual Disk [datastore1] vmware_temp/9286d415-f4ec-490d-b1c4-de34725059a9/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk to [datastore1] vmware_temp/9286d415-f4ec-490d-b1c4-de34725059a9/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk {{(pid=67964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2440.172155] 
env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f263c77c-fb08-4f6f-a0a2-a46c7c469055 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.179964] env[67964]: DEBUG oslo_vmware.api [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for the task: (returnval){ [ 2440.179964] env[67964]: value = "task-3456915" [ 2440.179964] env[67964]: _type = "Task" [ 2440.179964] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2440.187436] env[67964]: DEBUG oslo_vmware.api [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Task: {'id': task-3456915, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2440.690821] env[67964]: DEBUG oslo_vmware.exceptions [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Fault InvalidArgument not matched. {{(pid=67964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2440.691142] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2440.691702] env[67964]: ERROR nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2440.691702] env[67964]: Faults: ['InvalidArgument'] [ 2440.691702] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Traceback (most recent call last): [ 2440.691702] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2440.691702] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] yield resources [ 2440.691702] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2440.691702] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] self.driver.spawn(context, instance, image_meta, [ 2440.691702] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2440.691702] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2440.691702] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2440.691702] env[67964]: 
ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] self._fetch_image_if_missing(context, vi)
[ 2440.691702] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2440.691702] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] image_cache(vi, tmp_image_ds_loc)
[ 2440.692389] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2440.692389] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] vm_util.copy_virtual_disk(
[ 2440.692389] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2440.692389] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] session._wait_for_task(vmdk_copy_task)
[ 2440.692389] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2440.692389] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] return self.wait_for_task(task_ref)
[ 2440.692389] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2440.692389] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] return evt.wait()
[ 2440.692389] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2440.692389] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] result = hub.switch()
[ 2440.692389] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2440.692389] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] return self.greenlet.switch()
[ 2440.692389] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2440.693084] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] self.f(*self.args, **self.kw)
[ 2440.693084] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2440.693084] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] raise exceptions.translate_fault(task_info.error)
[ 2440.693084] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2440.693084] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Faults: ['InvalidArgument']
[ 2440.693084] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72]
[ 2440.693084] env[67964]: INFO nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Terminating instance
[ 2440.693584] env[67964]: DEBUG oslo_concurrency.lockutils [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b261268a-9800-40a9-afde-85d61f8eed6a/b261268a-9800-40a9-afde-85d61f8eed6a.vmdk" {{(pid=67964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2440.693787] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2440.694041] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7c153b20-fc27-4e13-a68c-40c29ad42dfb {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2440.697362] env[67964]: DEBUG nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 2440.697597] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2440.698427] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-184ebafc-a151-4200-a39a-1d6c49e641d6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2440.705473] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Unregistering the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2440.705703] env[67964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b5085cb0-ef68-4815-a419-1c38aa8a7391 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2440.707946] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2440.708132] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2440.709088] env[67964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-43f8faeb-47e0-4461-9bd7-034f645eccb6 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2440.713635] env[67964]: DEBUG oslo_vmware.api [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Waiting for the task: (returnval){
[ 2440.713635] env[67964]: value = "session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52435c29-c544-8da5-6315-2f4501410d54"
[ 2440.713635] env[67964]: _type = "Task"
[ 2440.713635] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2440.721014] env[67964]: DEBUG oslo_vmware.api [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Task: {'id': session[523a18dc-76cd-0d0f-7e22-2d061d090d43]52435c29-c544-8da5-6315-2f4501410d54, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2440.775617] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Unregistered the VM {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 2440.775848] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Deleting contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 2440.776012] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Deleting the datastore file [datastore1] aa9c54a7-7b81-45cb-9f53-2016f4ea4b72 {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 2440.776288] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8557a3c9-84db-4692-be32-1c92ea4f261a {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2440.782284] env[67964]: DEBUG oslo_vmware.api [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for the task: (returnval){
[ 2440.782284] env[67964]: value = "task-3456917"
[ 2440.782284] env[67964]: _type = "Task"
[ 2440.782284] env[67964]: } to complete. {{(pid=67964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2440.789720] env[67964]: DEBUG oslo_vmware.api [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Task: {'id': task-3456917, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2441.223444] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Preparing fetch location {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 2441.223790] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Creating directory with path [datastore1] vmware_temp/361bde80-7395-4243-b491-64a4e310fdf4/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2441.223896] env[67964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c9dddb8a-2c2a-4a97-b660-6fac3249c88b {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2441.235022] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Created directory with path [datastore1] vmware_temp/361bde80-7395-4243-b491-64a4e310fdf4/b261268a-9800-40a9-afde-85d61f8eed6a {{(pid=67964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2441.235206] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Fetch image to [datastore1] vmware_temp/361bde80-7395-4243-b491-64a4e310fdf4/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk {{(pid=67964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 2441.235372] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to [datastore1] vmware_temp/361bde80-7395-4243-b491-64a4e310fdf4/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 2441.236070] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb975e18-a5b5-48b4-b391-2ffb486633a7 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2441.242262] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b3b586f-4e9d-4956-ae2c-d50957c3f9f5 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2441.251129] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85ea3a95-d5a7-4ea6-9deb-6c275e5f426f {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2441.281590] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-748da4da-02a5-405b-8ffa-4e07d84dc21e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2441.291740] env[67964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1d0a74d1-a1d8-4bdb-bec7-72c99d46636e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2441.293330] env[67964]: DEBUG oslo_vmware.api [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Task: {'id': task-3456917, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.086609} completed successfully. {{(pid=67964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2441.293552] env[67964]: DEBUG nova.virt.vmwareapi.ds_util [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Deleted the datastore file {{(pid=67964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 2441.293727] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Deleted contents of the VM from datastore datastore1 {{(pid=67964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 2441.293891] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2441.294068] env[67964]: INFO nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Took 0.60 seconds to destroy the instance on the hypervisor.
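Editor's note: the DeleteDatastoreFile_Task sequence above (invoke the vCenter method, then poll the returned task until it completes) is the standard oslo.vmware session pattern. A minimal client-side sketch, assuming oslo.vmware is available and using purely hypothetical connection values (vcenter.example.test, admin/secret, datacenter-2 — none of these come from this log):

    from oslo_vmware import api
    from oslo_vmware import vim_util

    # Hypothetical credentials; a real deployment keeps one long-lived session.
    session = api.VMwareAPISession('vcenter.example.test', 'admin', 'secret',
                                   api_retry_count=10, task_poll_interval=0.5)

    file_manager = session.vim.service_content.fileManager
    dc_ref = vim_util.get_moref('datacenter-2', 'Datacenter')  # hypothetical moref

    # FileManager.DeleteDatastoreFile_Task, as invoked at 2440.776288; the
    # returned task object is what the "Waiting for the task" lines poll.
    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              file_manager,
                              name='[datastore1] aa9c54a7-7b81-45cb-9f53-2016f4ea4b72',
                              datacenter=dc_ref)
    session.wait_for_task(task)  # blocks while progress is polled; raises on task error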
[ 2441.296105] env[67964]: DEBUG nova.compute.claims [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Aborting claim: {{(pid=67964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 2441.296248] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2441.296459] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2441.318046] env[67964]: DEBUG nova.virt.vmwareapi.images [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] [instance: 2a0e1c08-8201-4ed7-9072-fdd90f25f120] Downloading image file data b261268a-9800-40a9-afde-85d61f8eed6a to the data store datastore1 {{(pid=67964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 2441.379948] env[67964]: DEBUG oslo_vmware.rw_handles [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/361bde80-7395-4243-b491-64a4e310fdf4/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 2441.439594] env[67964]: DEBUG oslo_vmware.rw_handles [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Completed reading data from the image iterator. {{(pid=67964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 2441.439790] env[67964]: DEBUG oslo_vmware.rw_handles [None req-782e42df-3119-4a72-b2d9-42d022b6663b tempest-ServersTestJSON-385895298 tempest-ServersTestJSON-385895298-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/361bde80-7395-4243-b491-64a4e310fdf4/b261268a-9800-40a9-afde-85d61f8eed6a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2441.470487] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b361394-90b3-4384-8b75-25e5ce0f0b5e {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2441.477847] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbd83019-4446-43b4-a295-62b2d003c494 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2441.506760] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-539513d8-aa73-4e09-879f-466d3842cbe9 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2441.513567] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-104dd0c0-5851-45c7-bae6-209e20882256 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2441.527299] env[67964]: DEBUG nova.compute.provider_tree [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Inventory has not changed in ProviderTree for provider: 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 {{(pid=67964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2441.535489] env[67964]: DEBUG nova.scheduler.client.report [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Inventory has not changed for provider 2c116cee-6c2d-4cdd-b5f2-5697c0d45f41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:954}}
[ 2441.548296] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.252s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2441.548807] env[67964]: ERROR nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2441.548807] env[67964]: Faults: ['InvalidArgument']
[ 2441.548807] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Traceback (most recent call last):
[ 2441.548807] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 2441.548807] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] self.driver.spawn(context, instance, image_meta,
[ 2441.548807] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2441.548807] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2441.548807] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2441.548807] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] self._fetch_image_if_missing(context, vi)
[ 2441.548807] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2441.548807] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] image_cache(vi, tmp_image_ds_loc)
[ 2441.548807] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2441.549401] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] vm_util.copy_virtual_disk(
[ 2441.549401] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2441.549401] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] session._wait_for_task(vmdk_copy_task)
[ 2441.549401] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2441.549401] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] return self.wait_for_task(task_ref)
[ 2441.549401] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2441.549401] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] return evt.wait()
[ 2441.549401] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2441.549401] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] result = hub.switch()
[ 2441.549401] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2441.549401] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] return self.greenlet.switch()
[ 2441.549401] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2441.549401] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] self.f(*self.args, **self.kw)
[ 2441.549991] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2441.549991] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] raise exceptions.translate_fault(task_info.error)
[ 2441.549991] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2441.549991] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Faults: ['InvalidArgument']
[ 2441.549991] env[67964]: ERROR nova.compute.manager [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72]
[ 2441.549991] env[67964]: DEBUG nova.compute.utils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] VimFaultException {{(pid=67964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2441.551192] env[67964]: DEBUG nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Build of instance aa9c54a7-7b81-45cb-9f53-2016f4ea4b72 was re-scheduled: A specified parameter was not correct: fileType
[ 2441.551192] env[67964]: Faults: ['InvalidArgument'] {{(pid=67964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 2441.551588] env[67964]: DEBUG nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Unplugging VIFs for instance {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 2441.551769] env[67964]: DEBUG nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 2441.551935] env[67964]: DEBUG nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 2441.552106] env[67964]: DEBUG nova.network.neutron [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2441.872829] env[67964]: DEBUG nova.network.neutron [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2441.885797] env[67964]: INFO nova.compute.manager [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Took 0.33 seconds to deallocate network for instance.
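Editor's note: the VimFaultException above is how oslo.vmware surfaces a failed vCenter task to its caller: _poll_task reads task_info.error and translate_fault() maps it onto an exception whose fault_list carries the vSphere fault names ('InvalidArgument' here). A hedged sketch of how calling code might inspect it; session and vmdk_copy_task are stand-ins for the objects in the traceback, not definitions from this log:

    from oslo_vmware import exceptions as vexc

    try:
        session.wait_for_task(vmdk_copy_task)  # stand-in task reference
    except vexc.VimFaultException as exc:
        # exc.fault_list holds fault names such as 'InvalidArgument',
        # built from task_info.error by translate_fault() (api.py:448 above).
        if 'InvalidArgument' in exc.fault_list:
            raise  # re-raise; the compute manager reschedules the build, as seen here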
[ 2442.000615] env[67964]: INFO nova.scheduler.client.report [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Deleted allocations for instance aa9c54a7-7b81-45cb-9f53-2016f4ea4b72
[ 2442.023758] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e47ef354-df4b-4ec1-b84b-30e732a242ab tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 631.726s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2442.024080] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e136eedd-a9ff-406a-9030-605e786dff75 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 435.446s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2442.024307] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e136eedd-a9ff-406a-9030-605e786dff75 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Acquiring lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2442.024519] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e136eedd-a9ff-406a-9030-605e786dff75 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2442.024689] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e136eedd-a9ff-406a-9030-605e786dff75 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2442.027436] env[67964]: INFO nova.compute.manager [None req-e136eedd-a9ff-406a-9030-605e786dff75 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Terminating instance
[ 2442.029114] env[67964]: DEBUG nova.compute.manager [None req-e136eedd-a9ff-406a-9030-605e786dff75 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Start destroying the instance on the hypervisor. {{(pid=67964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 2442.029308] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e136eedd-a9ff-406a-9030-605e786dff75 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Destroying instance {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2442.029807] env[67964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-009e79ea-371e-46b4-bd65-5be4c584b540 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2442.042815] env[67964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dea3bf99-6be2-41c7-a5ea-ab5ef707e368 {{(pid=67964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2442.075572] env[67964]: WARNING nova.virt.vmwareapi.vmops [None req-e136eedd-a9ff-406a-9030-605e786dff75 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance aa9c54a7-7b81-45cb-9f53-2016f4ea4b72 could not be found.
[ 2442.075951] env[67964]: DEBUG nova.virt.vmwareapi.vmops [None req-e136eedd-a9ff-406a-9030-605e786dff75 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Instance destroyed {{(pid=67964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2442.076242] env[67964]: INFO nova.compute.manager [None req-e136eedd-a9ff-406a-9030-605e786dff75 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 2442.076570] env[67964]: DEBUG oslo.service.loopingcall [None req-e136eedd-a9ff-406a-9030-605e786dff75 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2442.076903] env[67964]: DEBUG nova.compute.manager [-] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Deallocating network for instance {{(pid=67964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 2442.077144] env[67964]: DEBUG nova.network.neutron [-] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] deallocate_for_instance() {{(pid=67964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2442.124868] env[67964]: DEBUG nova.network.neutron [-] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Updating instance_info_cache with network_info: [] {{(pid=67964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2442.141843] env[67964]: INFO nova.compute.manager [-] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] Took 0.06 seconds to deallocate network for instance.
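Editor's note: the "Waiting for function ... to return" line is oslo.service's looping-call machinery blocking until the wrapped deallocate function signals completion. A rough, self-contained sketch of that pattern with the same library; the function body is illustrative only, not Nova's actual retry logic:

    from oslo_service import loopingcall

    attempts = {'n': 0}

    def _deallocate_network_with_retries():
        # Illustrative stand-in: pretend deallocation succeeds on the third
        # call, then stop the loop by raising LoopingCallDone.
        attempts['n'] += 1
        if attempts['n'] >= 3:
            raise loopingcall.LoopingCallDone()

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate_network_with_retries)
    timer.start(interval=0.1).wait()  # blocks the caller, hence "Waiting for function"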
[ 2442.247249] env[67964]: DEBUG oslo_concurrency.lockutils [None req-e136eedd-a9ff-406a-9030-605e786dff75 tempest-ImagesTestJSON-270378177 tempest-ImagesTestJSON-270378177-project-member] Lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.223s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2442.248524] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 52.546s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2442.248836] env[67964]: INFO nova.compute.manager [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] [instance: aa9c54a7-7b81-45cb-9f53-2016f4ea4b72] During sync_power_state the instance has a pending task (deleting). Skip.
[ 2442.249038] env[67964]: DEBUG oslo_concurrency.lockutils [None req-76fb6de6-059f-43f6-897d-38e6548d4012 None None] Lock "aa9c54a7-7b81-45cb-9f53-2016f4ea4b72" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=67964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
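Editor's note: the Acquiring/acquired/released triplets with their waited/held timings throughout this section are emitted by oslo.concurrency's lockutils wrappers (lockutils.py:404/409/423 in the markers). A minimal usage sketch of the same primitives; the lock names mirror ones seen in the log, and the function body is hypothetical:

    from oslo_concurrency import lockutils

    # Decorator form, as used around the "compute_resources" lock above.
    @lockutils.synchronized('compute_resources')
    def abort_instance_claim():
        pass  # resource-tracker bookkeeping would go here

    # Context-manager form, as used for the per-instance UUID locks; the
    # "waited"/"held" durations in the log are measured around this block.
    with lockutils.lock('aa9c54a7-7b81-45cb-9f53-2016f4ea4b72'):
        abort_instance_claim()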